var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz
[binary gzip payload of kubelet.log — compressed data not recoverable as text]
~>~~N|l?yjլc0OPlT\p2#J>/'THq$&D#h λq_SH^+|GZ瀮spe8USPNl \6?l6 ?k QG Dk^]kN~O@v$\IA-JHN2| ԛ&RꝺmGG\.EHTmv4a$a$$ NtvǠDz*2M$surDPlBTI=K2,$gf:[Q;~kzK3X 0`a[1A#86,e䩺TTU'.`A,HQhTB -8S$yOzT:vJF)#y􈔔ȝ "D(ăP y#Yzz5[g zlj_#{?'3 d=l+={ ǥr%M<&=qAx/Gp FLk^ Y o'=qձm1ۮc"VDZ5 28MHB3y 1Ka;"vb!䬠BY ƆPVRF/ Ta\me(7Nls<ǹ *pJR|7 ^6M VP*76H W "PBZ-$`=FFw%@ÿp:>%>Ϝ 2W07*3V#:ڠ#M4 LAP(b: AvIS3j{G ႑׏!hV{qD%cedm~&"}$vLZLPҝ'."gZ%ka1* t@M)WJL̟ytK薻K shTE@P'3IBX j΍^vH9L跣OBZbc*-ݢ`;e4 4Wܫ 3N 3/I ]o|<5"E4 Ĉ73͓4RZ@}2>F@u\"$ #˓yQ2l9;CEΤ1ыs("VKDK 1<2hiN&җzZ ]B}(ڳĆyBȎRF<+p#F1Ɍӧ6f#)SXkq HEdRDkFDo bҡIHoƻ27HrK-S}Iiܛ-6Fwr9ƓPjSɆeځ9hD9cg^o7m&JYEaᆣeVwOY6a5孟Mi'#28K:jGa;݂98r>ݵJ :XRz#P:!\sy2u^ȝ6%"9N&%qb'8qc5; uA ._gچdPR(5Ap3NhZ3z׉3Ч$NO״~W> 75Gi/N'>9E7ӛzY\7u;z|.DDvϵJ3Vgtm$K\$Ŕ.w-k^NgZJGeף~t!Aj_,ΥHAۧ#+Q`Jl B?%)R!W=CR̗DcvTU'xaz烀\Vsi Khߕ`r) m$ aR s y\p\ ou(5Ɂs1[074oB;YLͣwV~ڿj\BSvʊjcFEmk'AQY4BOU =Vt 1 ʷb?\ .pxbSxCpH Eη)WNgoXU-bdدx_oP~Uq=߽lշি4=eW廯F, Ng 5w޾ Xjn]SkR=̼0(αNHT<՜NpPK_f=,O>ً:U$Qg4FD!;--锑\ oL*b1J&ij"sD Dy &g rLv#gSI (ogueWSڀ_ŀ̶&-kcf %'송'C͘춥K1kIk3u@hs~=:B2OOdّ%\O6n.!<y0 ֌79.$si3/d O@.Ezu񧷮|)b(8Ngqs![g?mn6LfW=ۏ_y(rԡ-0|m5Pk+N#M;xށss[.qxBHjIA yLyn,!%.M`*Q. #e>fu ՇB!c2""-$[ suC JJCт@sQ+ K.\w=BXOP /LQ-Sλ^otخkHz[s!<͋yBl*h`fb', #/j~-̿joygNOr0}(~A]>Ǒ+/?|8j-Qŝ%Q.$HYrYG/33_c:nMWL6 u2FxQ3hזIE[qm^{ȳef"pCltD=qsQ7MT~yJXFyYj'āk#Ol%:J>YD&ƨIV(e"q{[ ""plX.e 빱1$I& O5ZkUSRJl|Bl%fu|>)ՠw_3W-;K,5%-Tui9$#yU +NZQk20B `Ў"i!+ag_\bm1}XB>z|>魾I*܌Q!RTR[qS3F 7̑hĠcˉVhV7T$52Km`u4[mMrALhR /EBa Kp1r6KRspZzw`嫋O]ܦhVWeL%Tk =:VUr⍒5u.Ԑ(ԹLJ+&VuVs+Y$'=.sSV؉wZiyL▅(5\j*4mђ:&$<"ϭE|i LCZ3H#…jǹ2D&mߡ%JS0$@a\- n&D5י ]Wɱ TO.oF|5)k桧V5n!1Yhl%K.Jp""%vBGl9$8{ ui,@ϔL)ji 6 A*hQ"Kr/\yl ,cB܉|na݆-6ϧ&_YS9a8gcK(&D0c!_9@:r܋h۾/n/xk\1+\1 CrW3R Rnq%ʠ)W=黷\q* " A'SO[O N5l3(ԓf^SWÒyTEc^﫟?~]|vmNݮVE]Y]C_AQ/ۄQ'C4Tm*~D(H?C<  O>.]Xk#=x0}D-ézREI~;͠YNO`h>]Uc_%p "<hͅW}Y?"җmBJ$P4 I\`e>Gԛ&"uO퓇DMy9>yhVfI\#,W`K ip QJ$ʨ"4L.ઓ#;*r5G}s#xb1Bv|rB9H<01GQ{5;_us [=Of`gĮF^Fj YA KH!Հv™:1T]B菴SRG:I6JPɣED; D锨ăC YͯIKi~q/{Ꚁ' }c( 3_Kצɲwlb3A\C>C'LTgq\  H9w5n2k:^)x:x"#;usu1;-[v]{Y^ox`27i=RA .o{Ml꤅WPT90Ɠw#2D%]/*wVee)r@ŭ DͣJFX 8K1DѦR]}3WP E9:Zɂɹڒh zτc:"Ajp[jFU !9Grl:$NBd 6Q$R*&TR5c1r6krX.,ՅPXNT.T_gdT{q.Уۑ~\ 7kdBTȀ:Ǣ *EQg&$QKbek!{^p&K@8(fra`"|L+]X)r6kl79s_v1xv`lAQ 'cI LIGJ+U"lRDRYﴠ2"C hE{,2%ǘu!@<չKa}X1-b)) n^Ѥd\ǟ։4vΨJ0I9e5:Or_$Oz⁙C'2p]o#7WK"Iv12I~ٝ sHg|Aj=ܶ%[-GݣF&?U"Y{,w"0m*'DFɢ9GC۷:/n7X=qy"Eo¼"g_-au!Un,'/$8<գBXQ!-5 vh.Vl9u t /#JsWE PaDiަ\+ʸK6N?A=:Â9Jzv&ZM•=P4mhOJv2s Zd!`M[c!\b!ڗ=TTbL5c @2qtX{"\mKW*JCWP2,JttoSP-+l ]!\BWVT+Fm+ {޾Z.m Q gL69-thn:]!J9HWW-+ .m+DI QvIҕ֢=U+DkYJȤS++Tt2=j.$\5pm#Vx]χ̍|Z POA>؊9藏8zmQeD@Ta ZA8 @ozë+kyJj̡W<翼۝nj:;DL /,{&V%6i졄AӫQ>^W@& vdQk)˺mԒ%ľR1TAxg\]n>`D6-3`hkf.oM3 Rnfg4oi]!\KBWVtBTwtut7.6E`" ֬ ZuDi+P~mO@O$KmWEi䃽\kƈm{ϺK곂HզmpihEKM.̭X#U5-GZ ;RV+վ]Of05tp#m+D+Y Q6mEGWVpB5ՍU] ]qgly(]!`[CWQB\5d] ] )5k6+ص&XFVChGW'HWR f `ATk rB]$]Yx|(+ދ n@_EwR("4m卟Cv4}4T+[CW6 K] ]} 4EWWt(%+ao]!`%[CWWf4 rJhc箪8_w\ycS nT]zj1Et-++t[ jtBututŌs+65tp51m+D+H QV;:Fw75tph ]!+ah^J%ѭ+K[c]!cm>R莮NZ]!\jBWwNWRN$)6.a sі[5z+*卾JOxiQ-ۯZs^1U勰~]Uv|J˸i=ĔX}`%m?Jr*TGJZjG\A2hUnNqji B5̀VƯ3#J&::E2G滩vfDMUJpjwSIR.\[f1RoJtS/RFpq%E!jVjtKgT;v>'04J:#{򗎷FWP6-YW+]/+0+|I x8Mg( _~{6q<9q.WťgߢWDԛBU9Lql:; Pr.Uw. 
Cp.zuyu)/ۢp6q5eeq%Z眲^KLYLQe6yO*-S  4JGhKSonoպd\w|l^Qg7g?/8U}?,ܒUhF]ZW"AtFpXUXE­_5.XBE媌g}[LF9(œa,4a?r~ ?N?QkE*ĂA]lQ^!C7_ےn{yW“hx=ZkQz^qop#nX`A g~7ߏ'G+]Ir ( Bd.^rwG}/V)żjY?`3) Q0sÙ0$ªZհ~8-f>^K&T~?u~MKC}*JljSrh&NSXML7]1}sʳ07i~{ yߍ*k@]W20>?za(z]0hV |[}^.A8~j M Pâ`.&qgOL`Bп`ii)%Z-]\nGq5g#rj/SH_di'`iJ\gem^O'>MW1-8j U>B@;gyg{N npjgH^˔kH2<ၫ) ")TuSr))!jS^Pcp[*D,ΔH]VLm 災I`vVydu>0w})nǿzMKJSe)S甮Z\=2op>l!RD-qTL2'H!i"R `0֬VEMhˈU*Ѐll8.)c$<(f\"rˈZ&Ԭy^Q;\O,MӾqz +~9MV#UnG7G#)W6*Aj}}ǙL΋jkw+8J7gAMKYg%VFżSב( Iτ&82BP>DzVfTF7(cb`:fYt) N8qR $U73yfV k}P̅5"-fH8+q8}:,G퍿Hwh8=2fCLT޳$(#6Icx"u,AVg̀Ξΐ6A pf6 ,MN1!16ƧK>eD]3#;F|J5gJf)SNcXuȉ=6ͥ'jkeDk(h')8Yiʣ,#I%E5O&NS𼌚683]gr5i6)ٓMNj/>i adf)c(@&kMK!J9XDV RKbcĺc_>yg_e>| [ٜv6O1V,_].x .>5RKT<RmR%rFQ=E?)2E~eѡAlL_(4;q/|/v&\\0aP ODUb5g ϩw5|-u=~~{L0ym7޾z?3(W51i[.M6> :J)Fots\!Na`8{i9IHґEey)M"éYt0('4J$"kg61xF1xY׻M\NDq0R 9S1;OR8NPXPao1|%(Kof`ɇUO6Vpx:mYlG}C)Z2¸j}![BB$QϥO<$h(``΂^I]L痱3>RO<_HSDf$XƙYY~17% jGL&f 3yjb4]ܿ{pI718Gfk9PkAE&g3J'm!$1JѠ >C*$6X~7-GCρd !!5Ku8*2,E6 Kw<(hic^0(6 Aqj3F\q"eº|;1#WR*DYK~Y\z̮Yɻ*CP|S_{CkcQI^?(lVQSdZqˣ'a}wqyoii-_fm-K} مdz[ǎݿOQ{(uFӔG G^SZjWdu/O'zM1r7 rt6(alK;{\#-_KWrooCrk&0wW 4g, 0>+Qf\Mlk4Cɛ+dS5}/3O͙bhsaYE~!>E|zxaf`\0V'-Rsd3yOr5] tx46;˄/ i'jg34q7]l!&f ̑}_,Ȣ;PzDn~74ƲV[n;~n-H[t\H,!hyqk烲{2'8cS.Dۜe7QtĬa3 i u fMq|CRdJfp-w?{Fre 1_)^3Yv&)1HF,&"e6Tn-vvUTG ;JtHh87z@J^+^/,o&] InM7v$ms ~@Ja} 3*Æɞ,JO^U&8ﺍbt) NLI:H<Gf]*"Q&E*LF&;Υu`~l"<ؙ{l^n62S2x;Yb>2l`&YBFgQ>2@B4E%rywgj^ד=&ݜ>W\p ڂ>ybS|9$àuZ7̢:3Ueh<_gjUI0V>< 7P~E)ԥ)Mơ(n? PG/c-7tO8äU֙GR.OsJ2EK„TZ7Y7[!o^/7\8ʀU] 'WMxm.o.6{-y5,}]\a9?һjclhuwT;_T}uH7ߙLG;ԕ'dxoN_ PwJt 5^%z7,|9Y;d]jdbTJb $hDFL4crR*Iki:t>I |g1OSpYxc5k$U!Q'h7\#S˖貌L;o-)lWb(tWncii2:L hu *DuPI"LkXllB{*/־c mɌU0^xMXcR:PDA +F.sJۊ.[lƳ|4DrPz2p؅?V33M!ed!q>_-㽠:!X3a&$=9R v zNB0IG!41PK=+Jb62Bi !]֥HDٲ6S<Ƙ/1(h5qLj8CK!  fxy}O8A*&!3ALQPvy|B&Wn$ @jNs .@A ,qK 'B#YsСHVF$2 TJ"E. Dr( ,z2PW~\V~tۊ3n.V/JxFn!"=2G]L&%$H,HJ#'>RY.b~1E"[5ݯ{rbݼ] [o^\啿WGLj 6r.i*(2UH V3j'd>\:=S¾.SՇebVPp6/7 Y,М nb;m~4 iv*m*ԭC]ZM-^|]vpNbg/Nfq&bh(tFZK9D˕1#-V:)}DPLlC$/\;A'S,솃6@Sg0-{"PbٛN&L3 'W Ȥ8U :N"pQf, p9θD\9uH^I.v`fڭȚwNXࣱ&' QQE QJtLy[ޛnMK=!}!_t 0.vzSciΧs_:0j\{t<34 šʯ#YK@@2NZqPZ,I4%` Gd9A ďFq2}?`y GCcMbf)wZV?|?_}X\b54 ?n/h0xLU`8<}xtQ%Գ߶}%W-BZȯj}Rkڹ TE`i^L"/%{'|G)}0;\|yXɒHID&^ ЮvG~} FXϴ'`43o SVet1J)u4RL*EbLgUֱg]|䓒ja60yo@b<L\ TiUgS {ͧwm9?*n{9?Xo} jowy&EGӔVm\fiz x2Qk嵽)ڬԯ+ !J|Tjpf4C{4Nwo|IVhy0wW_ݝyDG܋>p۳8zG*j֏_ߺo4}ħytN>b͗m[ϴ(h8wW><1ҒQ]@eN`c$1KoI撗7^'S/xQ^Aml]R5)Z)CUYbBΙI"))5Yjd{m}͎rk CH<¾>-~ fN6wx9l{={&R1~0d^mFKp]Apc) wWz=;שS&S&&+cSٌxFTJhЌ`ȂYZ21ϾJՇPb!i烓jd}3j< fL<@hŅK\;}Oˤ-'Ɂ$ #H!)A}~>?G;f։hryddzJ )HyN̛2Bfx O*ӃWKjtRzǣC3m#yX"q7={%il]ڂzF5Tʜp,R\危VU4y@ NxN8AFfCiՂe2{U18 8= nrrLkeP¡Q!p-[pk Hɘ@pb?&ˢgYcE(kip8x] ? {Ѽq-հ$m$B,Yc95yESaSRmg+BPcKKM&c̀RVXI=Ehj'-t(٘ސ5&_擼 i.6Gs>gWi0+?v\8?5TNU\lp? 
+[eG"*Gip&>/ Mf^=?|R_2_x &aLYIg (S3gzvL37kuw6$x},Y^5&Y9X\\Vy&m }?dΗiFϋѼϘ.8pQv^}Lw&(Ⱦ[O[d#))+%Z[^waurk]=4׊K\BJU`㙖wmFڮʻ·7oa\ s> Gjxq[avmUmͼ{FJ7@iSO\K[{nxs77v,c9 GbYǣzg}]gtc~Ȧ^UC:+ұDxJ73숿bgl0^|6FѨBhqyKT~w~qsOo߿WS(!&M"Xz;Luuoڻkka`eO}m>rC?l~0ʀ0ӷpJƻI NʡxX@WGq믠h׋W ,T2Ppo:VSA 4 д/}9k ؆NtHMv@CV<`2'5}c`$ֈ9f &353U鮮J@R@t+2s ?4Cb6y4?['LzQHY 5`؝ĜL t-lP+%;.BuW'oЉ|Ĉ8xZd75IaSzuޚMī;8o,VJ%9`+Sze`J_NȾV_`kHWlVn0pU%=wV:= +F#`0* \Uk;\U+XE•7鎂݁Hzv{ ƕُ;w%|/L|=tAB/?|\iϛP{&@dG߷V(3 fLk P`}iw 0b`)fW\C+;\U+•U\W\CRክT; !#\r(Ġ WjR}j%p=Oxf.ìϴ"0+Mʎpu'T;Kj\$t]%2U-/7iu?NtoU~*!0,% |hj2mluS(EXJow\˝I/X7MM$U1}6NūCnu&~/}0 K"I\2HE81( CZ;ȹh x jYPj-ؾoJ%ol?= by( \Us`xbZhW?6C<`ݳS?\WZ'W6^ \!I48 bIG Vs4$jwV3 +"BaW`U5׈U~i$1 WNbݻg.%jT2U[uuyѕ-4/NkvuԺMskWtLge\o?[99_t3;u3r\D5"FlIy}}YwCؠ6J2Yq]lﳍMMPp֟ +QJӵAnipΗ%9cY"'X~΋$oo{U wQ~?X"kޤw).Kr貏mHs.BSd *ghDO'y/$_>0at}rbۯ.O>$]rT2Zd` -Qioٍo|H^NCRb* bM6.b<%Nხ$>!3j3X4Yל_M6;݁ yp{?Gt;)= tc|xS}p/.*f+k. PT%)61&e VژViPPם~?;W}՘m2<=[q}Q#Uggyۯsen:4m)DZmtnm#=򨂩v܎L&n;nT]RvU} W >@]p֣4Q7ZaKt|Й\ Ne~~ 4 Ȳym#,zGr)zfJٖ.IʆΪȂ1JG̥`Me:or9k* ,S@P!Eֺ Hə$%w Rw\[3sq/do&{@~cGz FX+AĈZWc1)3 G9y4P3 dmX#8zLPbAb"s:P ;MM.iLK ,_X]}Kc)_VZ:/1~{;pg\yW|Luxغ'퇁Xferx2z`DclFJb <'L"TBuE[ГL*.sIV3s#cw\3,3B 刅{[~n_YѲ5".'>r;qI]ؤ#X%B2* mI;2YWBL Y(SVVPŞgѫ%+l*mFHW[9%.%"#vgG0;EmiDޫْ!=x 6E273Z䢘$XÇi_mpYH%@V YEG cM"j#cC*DdQagÕݩ)W`썦?ED1"GD 3"1yA& :X2,lQATfQuGAƙP+ڀJe8R6FV,-7&Xv?W:;#qEǸhG\q^&hHud9ԣJ6[l JK!Z Q3"6dGhĮX6hj+1w綖]֑8:.XI*\ϟp{,/oN]3gu2ͅ_i6' @I=RtNĞHaPBhC~9o=֒1 O#Gs2Xi-9 +)'s ~Yyt\cXdoՖPGEQA:4A(ъ´PH1zck>z_~ױEh_۪LӁ4 hU]o;?m>#t0 +m0{%}AHdv5S i5oy̅N <κ_,y&\(ĢIE"C3% sv ,|"Zt%k%5y`,G@g'VʠzĚہ?M[fkqqJt5H_:۪ew7܅|'=. FxS J'j<IΈsuywYJXW{ao}nV!{%w-|*|Jv:uotɭ3~HMZ?ށ!(ar}4lҀM9"i!dU,ԧ\Qf."3CDfȀR35r) +46h]r uT,T:&Bf*L)Q9(Ķ\AF(EZ h#Xv]ê+ssDFB)G:hO]4uGS3siO&6)3QLJ)& fgjV(,̈́Z*CSSj`@B)BBy2,)z L.C؝2m?7ˏ&2 ' UՒlh `YGґ"ZWD%16P[bZ *ilJT06J)XGUU<ê:&םs`Z<\g>#31 Or.6zwyΌ|ۻu;{[[,me601(}UeSUBeNiE)՗7nlaxK!V Иu-T0b0.KbaPjŘ`ǵNj!>!oZ|ۚv<`&M=g[aqi6K-O9ko #;VlX U/|cw܍כ_mj.C%Cz)Z]?ȎR ^xsз/E&ec- 5{!dEc]EhCkGVC9GXE 1xW0P~Ә)8t!8]tsͼy막rZ죟;1ҴtAj 8?tU_?wbz齽p Du5weNkju rZ Ӯ~>|٫O_UsvJs˪:ivjr|z"+¾4y[<۴ )`-ؖT |{G#idk$]P!<ߢۘcn6Z\]pKzxXerMV MSϸiRm 'w<6)Q3  zYޱ٨jt@^o^eV{ բ3lvۼub+glgYϼUnm 7хߛ1^jh)ϲ7_<ޗ9V'(-!rM j:/M V\S&%ћ#:1G'>#_LG>_h̋yfzA AI)2QIH>nT;F=A͉ɡ9Чӻɩ`v/]c" CϪ7,HQyOH# KJzC8H07k8&fj>uL+ĹUs=n }Qx,V«.n`r ‰/+t(E'nH?`VIѳo7Yͯ "tfsWuoͶW[H~2fnu'e݉ݹ`rLm$ aR sc5E@nA28]䍳s;[ōٕRr$A duBh>׹ԁFÕ j N-Bmh^v;CN8^Ƙ(gA^ h5#iAIF[%yJHۜH\kJ=!&ji.*`/  F8Bdiױ?3rvX;NXk x%0.6KxZ 7ݕ ``z9ё#IGah/F  mUЍBHࡘNLPOQW$Y(ךdX Iؾl6x)d<%v$, ;ey ɽIcmrQ[j2$Wro8X Ձz0X`۸9}eg柣~i5B#>{UuǡD`C5Bf/j,ոFWĩWi/RGҔHM3* . Pę)DUhS ׉pS䄀IO꼦VDT L ﴴ SFre6GykdRۺ*Fi&"1@ :g(@,J@mrp"TɎmȩKMuZ蠓ʳrZVᖤY"nEZgn35޽dZY']ҭHڬzR,mO& S6]ysbֺd[WIֵmviylEAyύvڎb%%/:F/lоyqiIG܏w+T8BR#L JpSdsDL^di ,qP}uz=U{4x`/ye1 Ș&ֆ9NJl:ۡ%@hAH ׏踆]cQKu`3E:ݺ_f e"6-z$>r$2 'Ӽ^A #qޏblcI`k䡲䡳$Dbc$+28MH=-KPweN݂s!(\Xύ$!\6 eI$OeQ\u1%ki3rKsOS^/Zquqē䎔E#>g݇Y@.-%G׎ɋ/Ǥ1HXtP֊8E/ KT;ڇ`rGOhoMpˣ' :k8DFiLGH(M#hRld΃P֭j^d:CWp{MHQqK=^UecJICiQAkj7P![doHN;kd45hښI3J1&79wZ9樻|fW)kN1k+zH7T9,ǜUosSnjTf͆WSP:7Ņ,Z3~u% \( Q(r eF[ȆX}ȉȉEr~Eu{1LhaeZ /9e! 
f J+FF3@_Ti-^, j]hD:υfB)T#e ƃC[;E!2aj).(CS0$@~gh[ wQgy< dLȔHGw(_>iqxWNR\_A=PJ5HiQrQqcDҺHn`hvW1!vFl~]-հ0vR)uZb6=#"r}e)F2!}X8zpeYΧ7EjMZM0YL6KB(Uq+s[fͳW[1{SeǦp "~r,׶a(BVz71~^@HH3õ#X?M0}Zi2'܈!+W1|Ûtjͣ'6j\\2e|_Ƿ>w}WفwN`9roT"Osp4{<%~?O?|Ͽ?}L珟?:F˜`ȫI C] ͇vZt9zwt9q^3ڡ_ܭw_~ Kpڕ:ȗ&j,tza$74,M aM26HpQGzM`e,NSuy\~*7&c(?k+ \xeط[[^2O=[5I \Ip:0s$'Y+()QovR:2?j,$>K2Kr1d)J⨌**B1:9{'T'N2N&d{빆xb0Bv64 xgLx$A^/Wԉ.<&DȀ}xqC5^jo5p(qTYfB 0E]ZXZD ` ')y`NE|VzHIN-`&(A #H5"%&2` ܙ!N HxSdbU}fQft]»w=L>]#v&r#i_*Ϣ,m:z3$ev*.u5ܺ=.͖gCy5e-[wv>yx{w|-7C.G@w(햇tξ㺚bᩍYoȍ]|`eC|nM`=t[I7gީsݼ%n3 sm /z^h?hd\qicO;ށۓ"ZU+T8BR#L JpSdsDL^di !nSp9GL<=*%_a#uu80:qBB0gIi Qg;4- $0@\*H\z<&T#ƀf8pP33),=rT@ dZ5xgl SKu`3ѐ)Q=>oǍF[-w-mJWW@H8bV7b633 <%eI&eII̗iP*J舶b ̪B2 ;ڄ]]3S)g1xK箯]QѡΕg~ڀZ- (P*0pF*EX2-(S|$c/FUmp̕N#zRƮ bl.b;boiI m10\[ ɪ QJ"V%QERyI|:|LBjmNa^!]vۮ~omW]ࢫv>mW]vۮ~omW]vۮ~o;]vۮ~o{|+˯T0j;6HRإzԥzޣT]u.źXwb]-2SGr~4}XUƋ:ۦ'^^q]ctfDj=D?"̳{\o\g^JѼy1)FcLIJp$Da* zY5-@%eT#6ݶWZTJICs9QG)"FXdXJTپm$ŸgB[xÔgm74c>?jIWR_\uB x>N>*= ݴ:BzTS\H1$pàAz'v( EbRxUPd(ɵYrb҆)%SdTP@`}frq0۞v1UOiǧ|~N}Pۇz_RIh]U"/ųL)[sF쾥1ɉ(H+AmS]Lv_ieb#WJa|3yt(me 6(, Ne~~ 㲴0=z[(x'g JBwIL*[mMVEIK)6U,]$ .THч1,"%g P&qgl_3svܺo@}GԜWCC_KJoz'N̄BVhkQ۶glu6P{!PbNbLL)X+(6vflw|6epX'C`8GD7RPlYPT{1ɚ ^l22TF|g[FGFϏ%M)*g72?6ܜHL0FDk)z Nb 'LMѐ-)D-3D( =9R$mSqA&lN}ajَbfXlf싅 eƒbEnxJRӸOY #vbpBD %B2kTqeMb8/d& cd3Ul6 dhlDљsR늂VlGp}(XPuc{% ! (ErlfEqK/hBg|i}W>Ҁ̐92dU5)hd vd>LCch$ e˻4A/UbUAr& k &:#lEq#(sxuȏ3Be"sк`9;/T[n//%; g#/6Ψ92L AJZڜJ+ݕ5!Ș*֎ 1Qi]EiULԡ D$ uEg V>6" H#?|Ӿ)"z[Է_k;PRG2V 6hTƪhTƪKG2Vt]e=I^K֓5tO_C\4 a ah!9[8J,R"E(P`LcsB'JBUt B /PB@a6!,9%6CdUh/}l9E%GE3s@ݸ'.[Ͳ> |ǾwW3NΏMWdpNʏZ.`t6!:Wx"Um*xB:y߸unf%k4/ٲTeDOZӭovl԰ʸ&]MOg/n^C,BP98]rDm!5dF!C`#H&mA)94r) +)9hcLTHȡ$9JWl2=Qz5]/8KG^&[c.B2i=5 >zJ!&pyɄjX:WwW_Z6|jh`R(EXEzy`S;˾4_lN☞ATzZlљL:rZ:탈䊐>$--JxwƅlQz*qy6or΅ۜwb&+"NpW>_W^z,ΩзlgiFak8&cPcT9d sU5pIJ;"\^L1^dI 0M#T0XTd9L({2bm~V*0&P#B5ir$ϷYnh3+68]c:,eNxj.$#!hMQ*"JF!8ݺ6 maz| NJak QrQCf^?8Yk|vEPmFTPԚ] ^xqяC:~̷o%X%pئO"!EFiGBk1r.F=aa|޴PMUq{o\J~X."KU7T%2)l_սia޴f p9WeL;q$ohmjފ߫ +?N~WbdzAƢ4;߁V};>zfXragO?4~rQ>\77Y釁&\zqJ, ѣ$|R&ןN> e-ήO/x}ʓm`݆'/1$uoroƥ<=~ןAY2R{z4N\RvԞ)/ӥg͓9֜' =4vߛN7.5%sF؎նcfywc}Vj0ﰞož/W`2؄[v -hGˮ[|z~L...Py1Qw56آ80қ,tHHAz7#39>@n.Q$)@@G3T:yy)DA#-9UZ2g.O7+ūG!F+bTk.bC] $HGWlhચ}1rj;\U+pJ`p4pU=4j5~#\1j.`Zqte1ه-! L_ONU'O~^AP}P7{6Meb-xس%PGcTNS@hN9-`q U6Ja jY:&IK6R5GTk-ѡ:%lGWl0\w,pUE9vV.W! wDpU ޕ-u$_G1jzI Q(j`hH \ZHI!@TUɓT+faQ1BWφJ]1? 7 ]_uz8]1J-ೠ?ziWz}@f4G)f/GGb0mvAϰsy{2xt&M/x{h>:Z|ƻv-ϯphI>ZNvwvo(nv._ynզpC|x򽺟tsO$ 1yp~S;?ל}sA]5lv,?#9ϖ0>UqdX>>5 ;}nO3p@mH߱o>?5Ly`rُW[w9?oz6GZG2Y]/Uz7_Ζ |v+=">6>B{_Io?grlJ@Nǭz7#T÷D=*);kZS鬜%>Wn.Epc+! حj.ԹZˇHZ3.R7M#s8F:>ُg ,ZJQ+ω6{Gb:0JHkf< /&W|(O]鞒r]{I6Kc8W[SZ@ Y;[\\הR#%IE/&3q5sƖjh4U4jYЂ= ̔ܽ7zhnmS1J](6NY2d/lt)CQi ԎƬr61b5)xٻ0F>8XԠDtP 6C Zb@$F=(aL! #A K7A=kt[XO%$' cEUJ=IkB\^ع7,/NE:?V11󃳣.0QCیZI|tx݃K<>o:@G6~*MT$];-6*d`fPDd4XvY. C ) >@(E&rZ#d^S1P>Pti,Ѽ<{:%+&CFPr#VH܁m CǷ.̪d!:T?Qy'*SL l;V+.$ `M0Iv<ݮnXn5̻>F9KP0 C8˾L <_⦗+ˋaeZ_r.߲Ƶ@͂ ]@H77kF֞5`ǿwΪ4QG]Gjx5ǤͨyHY#h4v˨ ƴʫ? @zV1im`QD;D;Lw[zP낂0dj*g':RC]֡7q1옍}u`V$5sJmhB $\a~G?C*aX0jH 1\U -Nu#Mg=13(:W!Xڨbj#cR=ڂ ڊ[M+5^tfzPkFQk&=Lb2Վ *WZ@׶䠧{[ ]1ܐmNWVE#]VS1Ȁb=ub ]=Crz8.b|FQf ]=#J!Lv4u<^߆ Wown׋nfBWξRvgۯ'KzPz\k_I߾w&bw~vxnZ4u=ޯx_*bu ~spbc~:/_.2B4*|7jXпSa~s:=;//soڗ}Lhqv}I̎<җ؝v\Fo߿Wە9KPZ1^7ntb/kK*-+i9 "SB^=zomdc~Ƽ;?Bz=˖! 5M:P&&ݹrΥ`u$2Gɮ'>+AL;Xd @)#ZZk˒r˔X^yjƒF%# ^gpƳ}K^T![7\>(x=Z@< ezo֙-hQzy-}J{DWlaI ]m4IU4Z}BW6=yugHWIm Q1A9ɋa7 (`[|aӿ,_GVN{EM荿Z?u?6o>$H ;c%]:[g۟->^-Vz=[|Ґd(Ioi|~ .i3:wGh)Tx@w_]wXuRvW?ywxATrc聸ˇ^5.}g8c<-Զlv[;5W/ocg~vg}8>jsʹD;)f!W3c{$0jkF1\7{lmx FdG[~w0 0x7=.] 
m|wB=-w+-tuٻ6nlW;64mhd@7gYjd;EΌeklIRTf(<8?\2co˻`Gf''&k3>g7ZHKA, +à zpIMΖS% {I~O-]XVYTnp/jƹ( O?'LO>҈^M]31 9m@`$ rRܢ*I]}R$㮒#]%)1=o]1@`B$g&g&>w~3!-πqWI\]%iww设Ewm۳ 8ì9焝J)Tƈ [0*= 7 Kr8n:K$-%$<.1|nZ2; w%C`쒴lQe 设w'Uḫ$.?w{ﮒ设Ew9a`@\fY}wWIJ*qWdî'sc$vWwWnrFڢ8bnRy'Kʢ%qd^&ΫɻKfbLd,3(]^Y뽸m+5^Y{Mr^>*@=VܯRϥgE}EMj!ͲlgǐYH^]۴hncmM2φn]SAOw1T)b+t@HmՀ;^sXeg)yLًcI07I5G~l 2'SrR<\+{}7PKѿ iwnaЙ@OX$,s߯o>< ݟ8aM?K705]ҳzY"z xE-%&U]My/)+˨YE`]o2)*hV~Jk|s87¸ RA/g97Vm/f2x~N?rcg޶ĥ.*T(uTuHykQ<ڡ2L H>|LMst}Gzt&m{>F\YwނQw2ggSs-M,jZhyQ 6UlhAͪ{ näs #+԰)pbgQ>F2cY\DRkmAi⯌|-%0X,R3̔12i`iXz ,!F|aTKVw.[lە,C^5?ǽlHSH tro#dK_VjA1wF(}\DT[; AHRVa@X1c2b=6M @I.grSw1a'2HA)uV^=4 'Y權嘞]ߚE2Ygy%a`xTQ͍QmQ'(߆${AOO6!5tM)N( WDS&FBWL$ IJ}-HZ|϶Gp| 2'.ӤT6Je!٠LfL )& mNE?HyLA[-o])p ژNcJ`"0 ܂ CGNogAH0BQRړVܱ8Cڣ5J5Ypߦbtg3T,O p瞊LS (`,`J!Ynu '1bePGuĎhTÍ"jaQ$*8ʝ7\ Ø8P=l:FY^1Tk%i|d1JxW㾂PyLz /i4i3KDov?DXo4Iږgx\%7ihvgXV8d>mpx/ܳ_fmou4)u@V! :ƌ_UF5s/9&r6ȹM[G# -J)'RHDc[ٯe/O_q_UZ[,mp}tZ=7chw4zj X< ;hxym5̴<[!>pb4 =Zi@Z!F |<v ^pj Oc*̪,<} LGnsj>&Άx{mc{Yu5@uzX~6? ݲw&d y TP{Ausɹ\%@5] NKui*@xGDJNcRYlVf}AE$/RpB@q?TXŮu7>]ЖKc$!2_4YkBDh'09`g!m3G߉;'KW<4m4x#9 å C C (CE س/Qѐ5kiĥ\#r cj\GI$$Qͧ6LWm42?χ?j3.2!%+ܱW/Z$ͥ4#)L:Û$tatX_6%JƐ ߟ9srwe{`Ͳx .GY"i+OǤ)V@"a~RLݩpiiz7[ 88!jq5 =wQdHZcq: c:pSԝ})IJв/&rY/е:-`RAWUMUmucE0CBqP*uLk1]{˫Iz C|&̥~Sl[m[(Uz5)vqa!TW%1˦bHs14SWw`> LF;m|^f7YW9%K%h}J6TW(7i:FRḸ*;iF ΰ\ RTN̬7(_w\ou_]~ϯ绗x狗]`.^Ż_¨hIpIm- N?um MmR4oUoRn YQUfcka@~z]\|jx9Jǣ՟45Jz;~<;ʢǺTa&ܥ/\ 8 4/:5Z iɓ%qĐS(F O`6LyAP) v:*][o#9v+B?%V[~=;.B{'v`ZĴCUZ֦i" ϬVESey&3 w5kD?v( _ , ?1+}3@L(ؖ. VQF"HE:9T&)UUQ,,rpR!8\d3˜ϡx&]y8;n>owP>*Y>LDς^ M$Dg4;Chr*O''7פӳo˿zyD$FHEpT 0kQ%vEiMh)N@&㴄_\e- ?eHڠ Phߐ0o!=˦w.fAbmT-6,>>/.%1E矖w~mvՇZ-p~3g D""(써pefzr9b::- {T_?3>=,,rjPg3c䡱L&H a_8R3KWT/V9 J61&#W)0żH2:/U51$XwsJ`M%p0R9)g \(E}]98-]_ WStYXLer5nptփ/m!ė? nS1d0̛A8V oWڎ4}Ƴlg%KO`$3X@4S97&4c" VPv(|·6L- {F.EfHyY3FU9PR*XV$[6e:C* yZцZ 89"!tsA]}{)[/ͼOby[>~ɘ[df+`@>%t|JA9$s+m  Pr  ? 
0hL{0k@f2o!BP#5DrI$u6"pV[vR!11x[MmZYvHDw^eD Hf 0hPE&R($/YͤZmUx&IG' ސGAƘ,KY@,:!Y;"|DEjI&JʑZ06PJS躏ge߫i0l<WF7qI'LsށڠqԀX͕BT&|V.*>h}5LYRݕ  mIew H ?AJ|^6zhf7r ڦO,aF"LrIm֎#f |,0t'5/)ֽ)"#w6d3x2 {2 }\_N84g'WArBjںK][ܱG F+3H24y 3`Q3^~}_ z[NݶwO].'g^%1Ҽ_aե|2nCI1\\dvgGꦣ)3,KWnn Olϣ<Y o'Gcr852Y@u_ro-Ƶ<>\qr~ Gю'kTQE R{tŏJ۵1 DbSugРk,3N~b_'^˥%siRdžtL?/ntM7[PJi.r|:@n 3#tr`fܺ"nhi֭;5϶yt5.WKU{v]6sŚ\K0 n>ڼF@GQum&SKP;ƠDّn|ڡ gɤ֡Dfȁ{v3lJz}ޟM# ?e'2jhqhZeKBLNIn"&6A)}s@ q3E_rWֺ`=_Yp=C?B_'X2#x ЏOg"MMuczȍo`\5< Py@˞e=n#{Fir\%"&(FBvNsC%T]vpRUt[t: Jޑ|T YMLK;%&-sBŹw>?XQ~{M 6mAx% {z;FpI7ciK`nsHW_ :_w ,;<,'lRV4vќڕE [_ Gw}ŷ۸ÁXc{Z0k'-Zg]3X|2,\פHZx蘞}c hoܯy"og\L-ö^so~zDqu0N/eOp6V'5!:ke& odrxsx~ +,^ $wqXL0eb2EVB ĺ1~ܺsQOE~cŗxO>Po=畨ftus h[tzx>7Wj@rߔ* )Fc4hL^+x(ZtN zZDZoQBO$*z5ʌ+mWC!e=.{2>:C/٨I}G5߄ϫ Q2< 4Zc1:J f,GxN;ε˸;,uk"{+o( &(* x%tŢw^4NhVUD<*2%O^h4"I*@$c6;kQq9@ 8)}nˏ8al-MR_Y7rǥ%/4c.AǨ)+ ̂A|U4-U~^,08"4;fMT!CLpǰ "HH0EHΨ|SQ|HgKLkH,€h5מLɘ!mfb~6̈́ p؞PZhЦ|3OD}Ϫb8R{dȕ]F̤WKWRu%p-WM!\,#qX]Q&6V%Ae F 흑cO߃$d xזuB'(A$V9=&.'r=pG(8։̨:O#1TP baCΠvpf_tHtp0>J=Q-k,Fs+Np 㣹(UKcjkWM|+u6%oꂗӓE:1[.gĜ0 'r;ۅ"\jßϧΫ&w/!zfeOB.tU7lu7fqGF q81 gI a4\t=?O 蕽2nz]5VmO$ƎGrϏMOR/F5ˍQkoMcaS\ .ǿׯ~x:~o{s#ѫ_w LqHhnWE¯#@݃k]׫7ZZkIײ ߤ_[r+}9T m֚8?~>r~OP]:ߑw& j̷0 UR5+m5U?.Rr01?j1(CM(ҕ1Mλ=$\xCR\+AQZPtPdK@Rd!2SCy}oN['2 Ir$Pw>"&4Y,䕴 XbgC[wxY_"?]cKziz\3:&(,A'@8N{e% SBDzs`UK {2a./ٷ}z "uJ=.Kl$h.jwEL V:QPg.`UU+ZU{>.zFAjf%|Ĩ7z]=?A&d,T;auZOz"I۳?(E_Kΐ7A(jB@pu:&|*7c]6I9X& 9 Y=%͟B橏!h%s*'Ӛ_Qs3q 89T4gr^l>ak)M+\+o}?\ڕ-Dnd[vCKy4Ɩ?ȀEQy#dA`AfL:&6#JȀ Am029nAȨN!$HΑ S}NC$t *J A42#gdbXXle Ea,d;,|P,Q6=ϒr:V6FoC-BѧA+$nJ(<5:2AcQQF*,l& w*YrlgÓiˑ2lrlM =L(J>&)Ĺ97# Â-]lueaԖ;Iku' c\&x@L+G%0KXU TR.I;#@ %C*PqktR㮈 #!׭x$X"ep"pʩ+b@$Ym\JGy("ZC@8JMJS`) UTThGMUK S"Fb܌N"j8̵MgVqQ¸hvŵF&A & Cp@V_(EH(Ya\-);\|\9+w6!dO=r^[ c|fMHv֏_)!ʧF}yߚYw&O2E=z:C[Tp)sZ=6z}j(#<lyrA OFsB""9n|TZɈƬY xca>HIY6i.%Vޔ@^?}Bo+4RzCr}ɶp':|}irpb%oAJAt`$ǵ$˜_:SEJ#JfcCAeMv$NMQwpFưOhR6*ive+1G_y_f Zs9SQ)b%|ȹ.mBEYa9%yY}lnyk5v`dn_M jqx{=YʐU&h{GnBW)#)xzvJ!SC)RHx83<)HCYv+Bc Bv"O!:A^W"FWFGR֨#kh" 0JGmV1J,/tŸ~+3m zYX%, DԭҚ]L V " K%@<3p[zXЛ8}0Ψ"9$@BL(mADc1&%C|w`0~4*XJpuZ ߆HD *q׍Q#Cmits48ۿ0T6!# /bq%eBQλ,9au?>ҽ_tBmAFpFakf uxo]¬hVQQuM&tߣA_\i["^1tUu&|h|Qgѐ ttvVնFmʆ䬡qL+)(]hjV4'd]ƣꃩHzxߤˆmT4ٜK/ 1\kU6nn7Ȇ7 ӓsGs޿bvqo"ʴ[G`/Q'.OX5iF dAKrىbVX̔>`RrGIWMjUJ9Tm7VY ?=Wu???`U`xGU=='yx4{۫wUx6OrB8 =.<.޾Sa>T" YHz glv_ocHyI^]v5#lKL1iNߟ S= mR)AlBVߟӮF`wcԷ/|^=MX9X FvVӒ|;*T L+!#,UWUl;mO,3mx|~6-ghw& >\G-+ݠ{VwC]|嗃 L"\DXIɟ9x]Tm:dbWBd7cϩo]z\lhLP?y`3Ep EB[j m^p#ٸֿs=[1Am nưN،y3Meb|f:5ugmu^;eNp3%wGw=ogp_4vv7{$Ҁ%D,ORye#wF- Tb 6%;Z/rC:2SPLA Cgì'j $B+%ELkA@vJNɈx!%p}:mUbkaȖV/+n=V=XWzH]KE5#'~n<]Hj@ "LG=6J]~<ݶ|R~H%NRZ* &D c1 Vg52;Gs`uMl&gh.P魧&Zc\H'Fd$SrӪ'j5==D\>X,^ӇlnuG_<Ț9<%x֙i d]bkmɲ4z? 
ti,w0/٠^L{bKnI{)JK$J"-G$.|_pO?Z]Pv~|y'/EKn)0[ mv]dUׯˣ|e6n'|/ٛN(9k^kzu5f_Ƿˬx3$ſk:Tmo~sF)F@|vDтl!wф\s_419M4/_?*:wUt$IVe6ӬRQ+m]d҈(S0# LP5T1w[l{oyB^Wb?ŨJF`ފ˼^)]L&%$%*K5H"h'ȐqB\sUEvö2D:uOz3MЭ+[#mniwᩋ?t=U>sIK(,'*Ahe!}(sjIu|WwU}\Lf)vT*j:\dz*e3!.rb6_s;]箩A^;Tǫ |r+e݉΋b۰h1JK(Gun-a"Zx<yPZOuq:K 㠎$ƒx> )HB&7W(E-E}^rI̭"0O 7($ؾv7ޛRZeWc$F ?y=5trsǯrW |lލ>wO1}hgm_}5eBH7K3Z>SkARIR?Y%/g6$8^a^w/nK{:L}~&C&Pee^FGj;Rym.r28 =14)Y0C7&#(18-"QV.(/"uw;xJ{Zlk N÷՛py4|-܃.|]rŨ$}}.p(u:K9}6MiyO2̭U7k+1ȵЦXL8Bsd j۫X.ۏWs4}k}fyC b畵6?pj;׮\ox߮{&6?ޟ/^?yG^xi/.mxu߇4}ėYqx?@ZpHs|#Pa{:[ڛ>::R48ZQN5'rgEAeypsIfD^H~Axgw>gK#&G)fDoTPʤV޺X1SOUPR{6u_ nxx2_ ]6!ԂZҏ``<Ң)e|n^"iA~0ca>b$ - $B„kϦ.ono՞bTڠ/:ybN- p@W_ea۔6w ji+`s gUn],r<{z3ң骠],;Xl*^Y$ayE 6ۼ`aPiFyϼ2p1Mw(M5=Z>Z@ekjDiٰp[ Z3+H B4CS]+D@W'HWF듺Fōъ޼L4ËS+ ]`Mo JBte%tCWsQ1t`y*G޻ f(] t[eٽr5SuMhe `U?,a~:}B1n-"ԛKr}Q 2,Wi㶷sj@F%ѽfKm_runF5y>pp33#ƒ4}+Dkl P@WCWRAH  BJQ]+@)@WHW -M zCWWt(m@WCWq]!`{CW:F]!ZyBBtutco;yܶdD]Hz2J)WmnM(}r+'uFvpO t+KE_ ruBbV9EDQ''3֟W (t(Еh8c: `j3B6C6C;t%v)7tpoZͺNWЁND? "BWd:ET'BBn՝w] ] j3ѳmWF]!Zyu(Jq} (ɇ0҂&%_R'Fe)k-:/>\IM>W1Ixez76 FvEGe=/'X_8a)Jk"IPxrḘ$&uPڏ?\4lᐵ4.I#3 >Gekeg4鏒%ʫ88g?qvL.434WRLe ^4̮_scJ)Vs?Ek|>0_~ߍjfj>m2F&0f]d_V,{?f*\,{ Y\~F’os.Lԇ.l\Q|KX o>tr\'7F͓stgx z)^5xƂ`_WgowF>)<>y$?^QjP 6TMvv/ȳɪ%GF^1z31ҟMJypa$3xla:{˳J-ernU!͍6ѯEWfB &!:/G}_פ|̀S(PxjY7wtaN׿{Bލ&a5ZjwE{r֓ubFBĔxlЏ#d)cL0IF( ftv@Fg617/8(̇볁f׬2x:C8C!fH  :`g)Oc.*5X@Uv>29٧Rħ„3o+WGs>M/%[v]Agom>450ҋFfg[]d23}&4T1.4{w&{~Y>Eb+634n~6uUVRWe7u};}o8W=C=b3 ,5#./aToz%k 2 }Lp`;TxwU &goo3z`R]7Ruqzua Fk_s<ޮèo/K`"b_Ms.݄qJa1S0F yS-{9 {@DZZAMM95j};IػpAv ,+V{@X Q< 5ESLq-| nyg]^o| \P(tZARʢi{ 'C9h-s_Hj/?ǿl&,<1A! o$fMnT-Ie@R8e*. jZFԆ'H(J8ήDO&4 d~$jџKT2r M| ,iVG|>Oe{,C)CAfИQYN4'nm&Ef Ǿ.m G[sVEKSH ̓qaTOsgFc2(:1 S띱Vc&ye4zl5HD4hͱ)up8Q Mwz6\L&i}4Zgкjw7t:k>Yt-3hMun-7w>5Vyz~H\QѼh&sDs֬dkVӯ|SP2r8^6K4ʥI D1 UZr|ru{Bnyu5/8aȵǩvH>EJ|$WΣluABn zٰg!L{)c.k$2 #(x2)2'ANzЊz-[BiK`p:ù%@!:{f Wy;aR*Ljd<)V:O+Cʘt{cliqp4rsD<#?7=:"W._˛Vn[ojfcİXk4 FKp&dcJ1E``/e'8%2b%0liX#gPWor̦+$ Z9^hJ_טkMȟ~ӖZ6KanglG]5A+ZΖ6eO-?1v0SGWR떑0ZYK0XF0ka^!`0XxTQ͍QmQ'($OoLmcЭmɄi҄[0$ 7nT|d_pΡ6"E̲m(9 hMmD# Dsv@?2A1B#5NBaaA+MR¬ ٨@4"ct4[Rb&y$,ViǬQFYJy>%lF5^h-mQT&r%J`<|dYU!֜K6w]\<6Yۄ1`U0(PA {Axg\r,WAI7@|,nuKV>*m #R"%1,r( J3K`@0kܧ( C>4.>z_yvR)|O佾MB""u˘LVǁL>O54"ŬwgR",͗sx:kZUV\Ϧ6 M`FPL+?4m*'7#Z~)*n; .1N3\/~u/;R=_GZɸ_tL,r$WtU7 Fa-A`*|T ^u&ddݨus)*Is`#䠿b+ypeK*S<ϲnra/~|Oo_|w>GLNj u+0.%#Lռ F؁;CO[njh*мYO|quSq窢0hnDO//e7wjџt`wJ[ b~5+]THm#˟ Q!ߧh\MEcK6WCoPb)#R0 PJ< (Jr Ŕ;UOҡ= r*L8⡩[ASPW2Mld`[ l ^FpX EA":tQ"u 4x2ܑ}kO,9֮`m }vgmΑ1*מb5ti:aZsI }›S^$1.'q&S,`3,:2_GZzSx$*Fr`I!zƌƁZ 1b"iOYI`<~㾂i H~>lwkLǒw~lԷOPSW~+vmyfz.Y; :/aN(E8&ƒ E![U;ẇ:q  㜷B :: z \cY ﬊>0c(Pʍ"6UӑaƸ#aREbD hl:JclBT&kvY9"Ꚍ9wɽr:rGH Wo'HЇ*r]ׁ]`hn]y]poiL3Fɴ>;'*aA'VI2&o[+pE-w./nuT|yϧw'ƪUm;ɨh]Uœ2]R+kA/ͫx[d4#u8\w+ SLžL/_2[ɳ &`ie,riAFB"q;^SA/h3$yxe`R=AA FaS#6ZFկo&N9[Fת쯠W~ ,j,z % Fx}T Nk]k^6c쑀]ks+,Jv/U*lŵɔ73/k O1E*lJ25}/͇,RIu==i 2O}F\k҉"7lyPrYtn8Ӽ}yӖՍTTCrzS>w+Y59pk@My1)]{KjûU(+?"椎V`rLVgspmIAL8Jio#D@&hQfZs$/Z)C^{-THƱ`c AR-lBj036UzƮ\ s!p\Dmv̳󛕟!ʛt^hX^9c8&ZE9q<4hDMrI$ N6{J ygqg6T^Nb&1 &DMj Ì&6l k7;vem0k{{-(d1WQ< P)Ib(Q3TiE(o!HuNB*pʐ΢= &D%L{1(ĺD '9wB|/2gbl͜+#Q3}=^L (k2N9ՖYmé+r@D'yڦHMh4EKRs(S<(`) k2JxL* >k0#~S;ŲMkyHü{^y^%EbJ` Q %-AGmNV$Ȁv`T^lNlwʇ \@aKu7,^Tc29~V9;k ~R7u[7,'wReZS_xE2  %Kmv1^nNX"Ixtlʩ3~0qRIꃏRx AzЬߢ' xʁ& *Õq`%)Ĩ5cs:A 10t|DHrneϩs)f^".VdX 6@uF4(wGncJRcP%b 9y! 4I(뒈)}4SkT#Ȝb2Š~9(Y>ƫr ,$G M. 
Z^8l!)a8hDD?'P8T q$`I(E@Q8c:nB *p/e{T9ADID߬ U@D 9hf(PiIaMQ娀fق7[#N $U #w _zd_-BNRfH>oi #ޘMf88"h"X5Zv1ߦl9^MO*sK MA{pw ya3e bs$a)TAHBg1}>LKshSQ(֢42KSIRHC"yڠJ}F "Cdj1/(Exhm*5OBQ|bKͻ⣗/#1)ϋKLPk;UU\9:n"o3HƵuEv):]8DIz'I-6w·_a Ym?7ľ<͋kDnIGH`h#M"̺O5 e)ujPzY19牻62R?H<rhWN~I;dL+8@T>INǕIxϣ&4NDTHl{aܗn&b`=1 1ؔQy͉epLd Br8^.bI)Fj.61oolv<.Qhr\[Xj--nQ?0ϝ%JU%{E"u\Ƚ£5ziΙI-8bLgC/>RO!={wg;8 hY2k;›0.JwaiSW㚞vR^Ȩ4/! O_P=*ۇ=BoVRjDηdOkU)K r4V/ӻ6VZ5C~6TyN1\^VdQ}{kEbY4Q̅sUҵV@Gtr1+8.<]F%قA1 Y@lא~k/y>}r'~r"'kRyz=8km͝Q]٧VFDr;פ|Kms!d|w±o|gA)(qe:/_Qֹ^[;|{5 ȽJz<âynTN[Dۜ~6?.|;/'ٙ暿]aړ@>|\x8=^w.",W^˿<) 'ގlu#e rB|>%eE<WlQbNqu9tփ*߄-gT 7Q ("׌ l^EǩYmpfm/鄳j8Z._!Ok8K^'W`25Ǹ_ΖBWFI8/< e( MkCWLlT}1Cq5$aJ3zMq]ރ^c\+pXl\>P2{Ekٷy!.gHuƜE- Tz{?=bY 3޺L{弛yf:7#F\Mr !/$#fpu/PKm!*@,t|bO/xˣ_օZ- [jMH0T&Ypc`%^6Qh:ꜳ&'=iee&PV; Tx㩎F&gA牖 yj:+sSpsҪWpjvqjks{csVq\ȉE[>xb[$ŢuZE0m ҫ^e=[ɍ|<ˢU1@'v]7o߽g4E޽~OWى#; %^L7;'h_>fi|\nؾ?yhŞwEy(MqvQ!#qoYwV\y[8PE ySDtV$6Yh]p^v؃gfy{Kɗ]U*nX8,lPV~W- 敟-wl,>`39P;\Mm~A:'[. 7n&WKv<rgR+,>ɝ+uatŤ z:B#;DW<7 ;*v(%tutSɺDWE,]v~tQ ҕP!Bp9]!\:3D+Di? 8`H߂aY6j&ln\HR gsSQ?+Y)ϩSVN|]Q$OyFk s-]~Y$(H9wX# cC*gc2FviYjBhwVB3\ƺֻdk~ixSn1JBv2ZzD g>FZQ:DW1pEg*խ+D>tu]eQҕaCUlXg d0eWW%W=]ҏ8qz`^z)p/<#Ibf6D";LbgkQ_R]׊tϲ2B7U+anf`P%ˊml22ujZ!qE̻߽{6@&OC_'D9X04L`D$ LK+J_G&4?Jlu6&zVj hGmU m6VQk;t^csz^q߆.z&nI'=[7,K .^,{ΤI5P1>x :{82S{`ŌA4z^9]}a )f2dW e0;. b%@{&CDT".DpCLDK9 Q Ƒ4z줹ORRǦ@'+z~sXbգYa[*JP"Ywjxu1þzP_KU>Z\K2=xmK:0ZF K}mWf8+\n߭|[+`#oz ƹz6Yg& jȇ?9xtgb:^ÙK%YIx-ҖZQ`촶Qh2S!{c< o'1؋dqwh֤+f>AGz|ԻsKгʾ]aWw>wL)Her@`01EztG4qK:M\6œ>>OL5#'r_ tzӫ:p(k)!il"3&41x.k. Kx1kA=$j:{~r9QJncΩ =Rc! ‹, B}T<:|-vjRqTQ[ZGEdQ0A[R.k%KƮMn҂nh2n)q{lVeiچ⧩;3_֝yzYkr:Dald; W u4.*n4vR2,2 >NYB慆Aj5s*ML#2,g"ƈ<=D 6ߣ&lZDp~[mAZ~Ri>5SmQ_t)CF>zXT =̉"tD谋0%\F,B0`xd!R&R/5eDD+1h+*RD;ם\hj&u)׵|v7ɼɇaO?9^nk>UE[BE}AV209CR inRȝrM ,[Bz!E#\-CR b01r%t"aDX |JamQJJ4J%&/.D):tZ2(Ҕ^ɻ: 56"TZ8Ҭj8 a_K,AId 5v'w+|:'ѪzN[ rPhZA( a:*ȣ OG'uFiݟ7?,#G5K՘kAY$bF3 RC.m\d !J >B-8>I}gRrEXoBt^knh3dGx %td}o\,)`G#bށ1ύ`HD1:\C"wRe^Ǧּ{F[)1Fh+9.a6XDh/"v`-ʖ0ONXZ> a(? p(fuDra6."K22#(|7nƭ0vFeV)+h&`g?&2 Cum~Jo_tv[M7IF.*J> -ڬ]v2Ҹ.QFR!7ϧQ^FUq+!$4 qR"g RˢB?4}PvtT%;1,0aZxٝ!3^^Fw7p~g.\)`u}7PPf:kQT'M!yOÎvT[HMΥE&p_ 7ўcq*?vc?rܾ?&|<5H/Ft2:TFeVaAb4~y_-~_|p2@Uh`|]`d//QyѠ~ϗxOE}YױC0?N/c^x83<w ^/e ;5?rgF.)I G@k R/wT&Gg*\2ᚋ_W")O_Ѽkd&4S&#؁k %gCלӆʯHaw4>ir,cT@˕(ۇw3lp jv@CF=Xu)z^Fa݌/Tm q @3CK_x}t9le|C['m?Óݮ8>O5xg-yJ003/ʧD8|۩Cl ͛uLW[H*>2Knڪk[yÃe}r%Tg=ԈhCWF#R:1Z) (tcF(#6"c'z'_Hg ʿPЎ!:z=E1n:($QRa&s^s ɠr)1)Iu t`к o7*q[ bNxVꭐ"oy c] )YuReNVv Cy8CKq݋v) 1UZj)q:~,KOLHwAp'& ׊u{kCQ{0rHRI,e0RS-FчpZ. qgglǝu椈>w3و@I I_E-$uqBo@ɧ%P4a\:08K|M92 3eH<Ä`ĉ Fu+¥1@CS&aXfǬQ`,Qʃ GRv,+c~$=sIƜKvFL|Ԙ`ҕ HP\kBArZS(")6 YBjZXőp.QƁ4K"2}AS!688nL;=kwtр4ac[gހ1}GG.rA5Y9Q[nl7.޸ A*JC2g.\ '4d1d %CIsKsP :@@SH ̓l oJ-4#0h6P`B<UX`aj3jX$"﵌FMFSOHKD:cgGG]Vu[b)tS8 'MfQ-gsš~f6t N: C\-&7V&Meӓ)04 T;+kwaty'ֱm4w(*wܨy=L|t{jqϾބ|8 `_.ͺu]15#\ZiOa_^޴NL2-jaWZ4 >00 .yT^LW/(~ eNU{,R *ӦcXGgAn1 cW8_0n~\nk7øԍOtarQp&Ί68Pf vN>AixYY\zS>(['U՛r{ ]rX^u([ya+tCfoٯAP\]u|8'.Mbxgg@|U$\LMQ/AQ+`sփػ޵q$e/I;R?_EV8kHbk߯zJ= KtuOUuW ;q5_CW0ޮ 'G~D$3k=c퓍5(k+G!ȳ3 6–:;u [tԱ.H :z4wd20Q$0R1%,p(ש[4A<~M/oVϺ;xSɥ$ou"8Z6W׎;gc%GiZkuͪXC+BTt@`[Pڴl fn}o%5?wK2mA-E ;|o=HaiIh]p\P}KFyNZY: jgБp]]+;-NA[I23! 
Lj0dы@`sB"tJq!Q:eB] `1Pcaa2w]9Gjm~ENigmo?"6(u&؛>OcCv_fЕgYI-͑{Uv>څN1{3hCh_Gzߊxޥ`2CȯL2*.])C$&E*[/\R9|w2>#ޞZoZ͘j:sc@5ix,S*GڊD90o.(~`6Fލl)Tt\`w4yQTֻHd1B'"3w@+rn/~e]2מ|XSIo)Z~EY>g-n3xI.ӒEgtCH5duɩI Ա4>TOww˽,e0'޻*w=fG?\^$T;Yz{8:) iFɢ| -=U㴞[/<:[Z.=џ{Α{rXJ-5\UT^UP~,7a߽(1IJ{$!U,py3Y_R ܒ3PSLp%'lh)~2pt;<#WϮ$rtȼȜDf,L|5- .ᴐϹ~X|TjHGcyWJD|jfBP+Wl0l/Wqp__?Ǜ_^|`^O7S2[#@Ak; w547ZZЪY..d77{*v{k#@R~<00z?:>)ǣOI:U0Ui"L8{DC.R%w SB,3Ņ_lܒY7GRJiFbb_,Z3$ɆF'S[I%+T!yEۅGmL{G#sW7m:0=Byo1e%-9dϙ\54g+ NmN;ɾK7O\ 6;.&el[ gwfIlyU"ʤK1 vU_45L17v08 ᐖx[ÕDM2h0.ZJjN1.|&c(ܗˌֳp^fc }=̽`m9|hup,6K{7z"K vjSTf}jwyԴoQx&u*,p.rHgMn}3@L(-] DFE$(u9sLA:SVYXf12Cp,%g29qCJ!O tAȹcyvozoϨ:y>LDׂ^ M$Dgĝ!D4L9Ufl @`ƉRM5I@v4z~WЋ#B !1B*rR05zW.x8nЄ,qec'gaOaGzllY _%zF"ѷ(ԫ]I%EͯVLOI|_7U k~ڂV^E^=ؖv+lt:^z2>(=3S/;<+ m{2+-ȷbA m!mL.r =s&TS Z pk- K,KHf29e#,+0hz ڴ` } 2](] j{Wބqhѥ/8o {Zq( 㴺63Rqיef1! }XS@sG 壆[:bBM⡹VoG%> =lV%tAl/84/Ig]'F_ٯ_ɘ$*o,LWy3zUӿ hp\]vEֺI聏ɟW<$*,z6IB.W%Z|ۓJ}I;z%\]ݏ`ǟX]݋\ɟٻ8n%+ڎK+`݌,w3qO!TؚCϒN;ld7HjOE>qk[:‘_;h''[ GȸWwM(s7%[C/bqK>!Α>ޞRo;wO˳󻧖oͫov oCǗf><ⓣ5u<ѓλlJ0ax⹅o7>;zQ=↸7WE(^WI[*QaG}*j [w233p@ϋ:}uy{f3 =.GK~Nǹd,IZ}:a}mͅ>TC@17I%8lPEV!f+dLt~IorPU,QFɖV1I;׵A3Z Ct%'hѵѦdSFx I5KcVaZ̠Țl\]J0'Z[D,=AWaxhT}BR@94jYЂ.9^ژLu hQJAħCIYg,4D2R 0 cYEa4Bva{U7*)$(@xhJ>Nr7YsզEKG:SAd*|,ťş!IՅܽOYgUA27-G>#ԬF[ ރF%k :>|X5b'ܭˑ1c##$63f Ȩ*ϡM(-ɄDK]ʯ|^EbT 9>tF 9yR 37g ,(R>Y hjA 1Ȏ ўF >B.聼0ir@,)MV΁|r`%P["@EE; b^h5#qIٰhD(Q]鍠oܕA[,X]pl5M9[ɺ(,qiW* HJƬG6lm"ڈAEF m {ˡ7fpydSXvaIHuPv++wN dΤfC2#Y6>N%G1jTPTߕziZ26HQc&#̂6B@F=5(!ȮdE7 +Z9q2F[@zIT8KPT4 p;Gը |u@x`1gC7a:?%:T]5qC%"}8>)9dT j,Wy(e3@fih!\=+AtޮŊ2YH)(J892r,XT5Y;!JPA/u: q `,m+_CPqZD5ڽ([ӽ6 (`3XO ;HH+.{ތD$J(k! e 1P =(aL! #A  -ڛ^,KT'f%l+ .*K;L'Q_ y7zXż#՝ mG5juzuiy&A?$nדL;O_]D]e TG<)}6C@FD4=R\ͷAyh"!݅5y]%@_!80.'MAu $$7(ڽ,hXzu,bE5#e$ˆыA(P`8$e#[qmlc:LNAi5eHTyP:P]y2 5\oFʰ*+ᠻFB A1-'jjUZ eJB=``eԡ;;kFg52TZsC/5{%v̂=orC6eu |4|Vsp / Q84e_Oop^OϫA2i\ϕI& , Bx@ c`3 = 5`.};ZiVSGjx5ǤͨyHY#h4v˨ 4-$/G[/*3&5j7LJȀ=aT9 ]Qn$}qN(]tR" ӝRP=`P낂a KHπOt4JyxHuXBշlb}%BĿt+AڵsJmh:Hg 9 )ч1TDyð*a+pYTQc$tGIՍ11h>T],~ xm,mT19cR=ڂom-+5^$Xf=Եj@רM5tvb2Վ **Z{Ok[r={Ϡhd?j7jD RMwU&Tڠ@J+e`j?&zFO^rvvy^;W!4ӹLtҁESv{r[AOwrUS?a.]۶ݛ=6~7;؞檈,DqErVo}N F@|X'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N q`@w+r0q5N Z@օ}w1J tN kqH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 dM2jMN ^;Xϝ@ 86we$qa6iqH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 ]*/vC'z'VI@qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 ׭"n͛_Ny)z\Pݵvq~S>^TLiM% 6ǸίƸ^qQq K_ X0zEtŀY ]1ܠBW6k2+x;׊ ^]1\BWP&cYS1Ȁ}\ ]1ܰb*/tutEQdVDWdj]{bw48;A0](OA1{Jv䪅3QtD\6O?l.^Л'{hzRauG:;e4`޼9 ɶj|ٟ۹጗N1r,M9;Kve׋WqZeN5͟*}sNŧʼxwڭ\nj79M 2)\/Sl*"DAZ߼( yv;=Dco_t}"6tMNƋNB\6!ʯ5]8747<<`lIB&/"OI-Lt39ڔ2A&E^yA~܌2؟m1a{BWHW1gzEtg(bf5ڸt(;DJlv++`q5b6܅QjH}J?r[|AWOko:~\ץ ui(< {_*ypx//w}=\R&2N\KJ %˴-٢!(wrzl+ޅNW$XC2"F+Rk]"mq,whP0 Hn:mkj#WĕbZ+P*\gxɑpEj WW0g'+le:+uEjʣH-q\QƠȶtoĀØp#J(tfWاS#Lr5OӤEiMšt1mV,%\`j HmP%hqe5&'(9^v,\g> W6G 1㪃T.={+qI-D ڋ{\+q`ӋFpHp-㪙Z7 p%z\=F˄p3=HԶJz\uWB8H W(X3 HnohVqE*qA\I)pE$+ XWְqU쭫.JW$dpr9K&vEjWRW*QHW$ؚdpErIW -;RWĕfJמŷiAY 8 .0*fTy#ā/*#_6bJF]VZJnFRLǗяd=4jYS1Wt͙`bLB (3LhrJhC 2E}hYB F )=gFdpErKWVFT~YLqeN W(X%d]\ŐJ%z\uWNYSzL]:ʭo#TJ<~f[U#_=CCjVJ+e\WmzG`]5Frr;H0=:+yJ"5,\ڶ^HauE\I0ZpE-$+*Pa6v\Jn{\uWJ3)ŮHaZ PWyqE*qA\Y_T\`uEr+R [WRWĕ'm"|Xʑ ojzk<%'tbv(פDJ;I%1.bhcXJHJ(Ʌd`R@*c{k\g5\DO`D2"#R+ěUq6%g :\j'j"?bvplz5bhl$X5[U37Q4Si#[Jzls3 KeɅdpEjWWĕE`MW$7\ZTƖճJZKW$P{3m?l|RTU'qV6% W H4T>vE\ xBB \tpEjWRUqE;uīTҌ%+c]ߠޔƗY :qT@]R&*}DQ?ƦQ74mZ>TY5SK%(bBlyΌrA*R+\RlZ`h8mJuf#Ҧ+R /!Uqe<7$XdpEru2".EǨR2㪃rBR[Pf#\+R+T7+8a7\fU3m[WږV5Ricx p=i0,!\` \\S"v\JMJ2CF%OW$7ZWWĕBH3 JuEjWRq㪃\pEHW$@*"ǎ+T)b{2Yp6Gjdns6B3:aܻAϚEFBJ I-1M*k4JG`H'fGrLWֲqU. 
n OW(؊tpErU2N05T:㪃rS ; PY_"$\ZQR)'MGL_Hmo>j&W7Sk[J.Hb p{\=H0+M*"mŮt=:+!y=EB\dpErLWVqE*Uq%Z%+IW$TpEjWxJIm,KW(؊tA+qI-qE*qA\t؄p%䶽WڶJ6SdJ3kKЂdbn옱F%oPol Zՠ4L@'hT/"34T1`5'OcFk#3-ZYԬѾ#Ѩee%; 6kN&`L$r6R)8C  -sOW(X%d"VETB踋(*>lHWVY&D*B_t*tW)m"&$&g,̤W ́MoF voj"X_!Lm9T#ҴjRAd2 pez\=9huBBdpErvU,v\U*]JsF%+XW$WTpEjWWĕƁMW ұhGNƺBJqE@ LpEɅѠ *=P%X㪃# gȬr,\\X*B|RT:=:+@uV(mr4fO/տbn;+C>C>OPov0`r,XK?۠޴nW[M@r6^]u[Hh8C`90_"9\b3%Z;B?DxƞZB@mYzCY(/Eru^P{Oڦyԝ_վr[_W{w!<exjRT![>W!j3d_/pDl9>Ym̋:ooho[y.^~:W(R]pߗ%{zb5ݔ 2.۷> ee㻴7Q|,OYųMwcߊX;d[;no]}j_߼.frV?WP~ D_a>'5bCl}:?S8gC.!/<-Ӆ,:BtYrP,rkY#b[Jr_Pۑp=њ:Pq},|`,g^& D/j:ٮoOCm_`>_Mʀ`:9|> ϣф(h h0k𷿮f3 6m6}6<7?Y#5YSokwo[1ZmW^䭓7gh]}z_\i>]zO<"'<m5oM׵_^*.O.Ϧ7Cٖ3,sD&O~3f-j]݋ݱ:3<^}zokC|0QE(ƃͼ锼!;TFW咓3Dkh C׾6*W~/x|1>9ŒQ|W8[ЅEjhO˺A;|CS֕?šm _NXêJV.~K?kY4sr3hߞ`I\6zbx_НP+ x;=CY6|&L,Sbo=[[W0qVOy.&9rE{~sFWfMzrZ,r>qכSTN=~EYYj+jY>>~Ӓ!ϲbJh!BXI{i#"+zYUz.QT1=G9~+E)e  uY3)m54}Ƥ 8>AY ش܌"jN8 n[8צTsy?Qك^?0կo~xZngwP2u؊9_&r:x]ZղZ϶7wf)⿳$K:W_{w780W;HwflzyueN&*gRԫJ+u_].( }2'ȯr@> E='dzOO;Oo>7/??.WÛ_ː֊}$ק Op{E+ƏU4wUꐢᘵCuǬr_(C;ɷsk5|?y~x윶UO=a^.#ŏHb{,b1w'O2߷ؒl9,[v3iv5UXj',6ظꢗꯛRW ԛb qu4/$Vp$n=#FQ$ /OB`r!#med`T|tH^ ot.~8M^koC :ؕ2}v㤇1ҵd(Ⱦ<9( aDq8+%:Muy 54b HõKt8Á ۗ _2ǞhĈYWuӛ:ubObBbl۲ydd}58~zW^F3V+fw :kbcEm&~Eog=<JFqi(Ɂ)g'lZ?|\aذAXy{{z|=ws!kjpd72z(zޕ<$dO0FDk)zN%%xYOZFCTBZdӓ#E6d9QaDes2(/F_Hoaf8bc[(G[xT[x-Q}ŇOSO_6d2ur6_|;ISP2QE2k*!mhpY+w;YӖTAyvz d5hdTN#et\ Zj]J;L/҆9lvjucG="{}daysBe"$@f@.!JNXT>g|i[>Cfd(訲e[2D#h8zH|HE'Tףa3qީ]fg)W88"юqxAH kuNJv] TTPZT*PYё`MȾq֨+ ,RLH!f*SIkk%8-2vqޭkHf@.hGx/@$]%CgU*Q&/J[!ň!ʉУ]<] Nl5;~@#}?_#DD8mLmݧQ,'ُOhE-~F*ovF#nk g28LRs l39p[Z'9 Y^QQKTՇ:H {y.\RE,!]n@hm}mMGn[1:eDYОA?8md6P TAt Y)#B4AE r*ɒucX(J{JQd\B y m+BC0vX2ik) Gl1S)@I{VLY)ṭCYUFB'PagZBh(&#hY"!H l{x:9¿&ݗb(!fdh|XKՍPsgؖ{4[tj/Ax.*0{%}gH˿i<%0q"<苂s_fHb'P.d/N)^Tt|H/Z6%,SBkyT,USh/} lsJG4g?a8P<8sj;:}W|mo*ooIkWLTm)(^$@Ng:a :yUza^GMFc(1b/-{ui 'o=e&s9-fW?7b(6!CsNPab^&*j@iؑ(G`#H&mR34r)e -cJE*Kd(IPi{Uk!>*sYb+(LbuEb!XOZc=][#nYE|j ~N_y}03=tWKy| &2r!&;TڿQ_zl&}`"P4ΤP XtC0ly|Gk29.@Z!*TwK,AQ̵Mf&9-"҇E,1%6ePd!PZsQBʾdBi FzٓųYk Y𱻜M?~!Ocrry>^.OѺ|V lJ<;ޗ֦;*kKYd7[#zRvǠEǨr.;RNmlR""ԹC2 r4 /0R4l#)0-$a@ uYJn%0Tb\e@^mڋ,ٱwk1狫9#njz-fC"ۭuv x]u!--dQKgm֑0J%*`Hxg:[K#izFY%D"!:$b1^}X%dj'XaH.[pIo?7F'^ҳZꏅ6X@Г:nM[S/E-Fo<:6ء-CCz(jٝr:^jaīoti|߸*,a ElELA m1N\Iho:s21F%Sdɼ3ȎM`o`cpE{濼gO/nV|%ѺO6BՄO"Tz2dq6 Dֿ̩}#hGFuϺ].ӥ~N]*zX;FZWҡ`wuio>.6<)>5Zّh$Ƶ+%FW׷n|Rf89cѻӳ ry6?ݐM!p7F/zWK}i0n۳ŻUϒN7qNWK8t1F29żT45hЗO _A/+懚y8AV}qCqG, >j{}xywRO80=e{<\70Y>כ䞿߽ruv9MopO W<2+֗r6]Ltz}E%[yMwk1H=Ɠ/co?H񧋳ɟse s7[RߧOVE}Wދ_⾾osc25EB-^p=c*D7'p6Y[lp*[^N7~o&ܯ9TaUjՆ_KQ{/N|JR\Ϛ\tvzz(æ " ͨ jy_[}o^.?ze8y\e|{Ǹ_^_/]J?]Λh> }dbt8NOS-Sl1 %99,* hchwnv16{]`@EZծY3U{OθMzzfo&,Tcc]w+^$"iOAdEAٙ"T:yy)D H8-!'JȭyI.Gk>SM[uV/ Ǐ8ש;@msCܶ1#^}"j Hur7ݪ%W6aJG`FŮr奎L*!K⌆E%ЗZf2j.K6)o kDL1MtR'#%Gض DG2|2:e]@"Z2321.IƸ8q:qhH‹zemdM!)Z}6ŀ`L 9k8+} ISk c5=uv%/~RNVsRϊS&}l7f(_&tB]g,i01זr9HںF/A H-'sE'a}pz Kʃj,>L*֎ 14 4UL eD°LJ H^9fŏ'4*HZ+m#SZNm8FfnFiRCR{ZHJTQj`>]]U墂C~1wf! 
*FDЙՁMlKb^n md翜嘮;y:v?W%'OsH,O\#߿^J%տT_,aE-$%]ᵞ%q+Wă,fyћwo3F^F&^p4l5WF\Z/{Ng?_VlZB }mwi2=qmβߔ8[~{/sr׽<ѵ3Gp>.(d%<4z8Ʃt`R7> |QDWb83*my8~NpPv0Ϫ}=,MzLzlZEuFSiD N0K޸B:e$W6pk%[# eU6F)D;MMDaruPHY"!(OM<ċs}Cqb׬7I3l_rm{&:n}5Η[Rr4iF[1{fSfܺo颾gj.ݚ䊐EkgS 9%%|62]ܺjZLUs)k{Z}zM{KkrxMםݞڦIݶ"fs?O'+Hj7lU.f]nm,E:6 ԸELWk" ŹoW2Vc;Zz(ա-0|eP++N#Mb{GCGQЏc+$ !&%)2 Lh"&h ,qrP6Pq8#lwϿ]H߼2mҚ;r,t/bK`s*!% aR FXp\ ou(1oF'>{>OnCj6y$ äf6_]=P 8Z{ʊ*cFFCD墶 @ >aF1]t}{Y &2ј+R΂N'kF,ӂxmBRʼ$)IѵE<87X:@Ѣ4+[bTxm0M6zuNmT2?vq:Zj!Yَk&V}^%-*`H*j< S!D(OD{{$yAː˨"nM`$jP2X\z)%jTC5\d!\`?KnZP'X,霏#x8+V医*5 |z\]h>C%Q ]@)E/J]QѬ6,Aq}WlZ>_r1}w|R;YEQy?&J9YhK  @)-uD- EFU !9G(S}.B2NhQ$R*&XL[/fb#cW,X,+^ 5_exmsѱos߮:Mʎ#zXqL&CtjPX@Ҡ:6%I+,\ Y$,&K@R3A%va`"|L+]tV(f_PEajF|2֊+˨h)Ibhf P:B6-tT;-( CFːhϣB x"^ ITGema<,&vmgmHޏ`'Žk՘xWNTMU? H)Ug bԕ} 5Q92n>IϝIҕ dYZDe;5eP^0)H.:dDBg)$ŕJ$ qT$&Ų yM%eSX Q[%Dd?\!h啣p ɀh#$Jy璒lpQ@t[Bh )(ml'Ɣ&.A=e%]ڡ8wUw?0*H@Z(DiJ53%I!Z\:,PCqGG$^{iZ'\; A2c(|dA"BaCeǪaWR*U{:4@ @I%Pҹ?@ayfX{9@?BhaGqQ} X9$Rq]ʠ5W6h\rOvY®8Co n}s6WoWn~RvAL姌\s%qvkఱ^W@}.M6lo,]5BX)Ɔ.p9T_^;57?lϼݙt;;jl2o0zw6Fyza۫_p4MfKQOg/˳Qk\{|z J>TzJq7 Uy+tJ> ܸ87&;ɷWn+qnx;{OP*I+NhG`h#X.C60뒧>H 0 Qc^vkٷnI>I}ȴ\ T!yǕIxϣ&ITi`Dcjݻ{2*9 SVS&2xRZrFa.bI)W6?[~!^&SgG?9ZxGv7߭l6YKg^jeLT]3J!Ϋ=":.\^EQP=4'>K+ca0>Bπ< y&!1Hx88Crh b 1E<Sm&W˖0wGwNn-zJF~|6uQ[ߙuy`Ƌ>z29voꖿ_?t~M]}4K왗ݡ?^d?OLJݱ_߼~Lr暿$I+O^} ޾Ove tq?7>/__g6{?N.ܭp𓗄k$ ,;}至k1ZIig8j~FCf͹IuR ?~YLOOc'EsbdCޟY#YPۻME'w^k/\@k|f)WVܟ .צW]6|[hL$'W%@F/š_"%Oao]oR] dC,GթY4d>jyG͸ ]{F^WOqkZϧ<#- hݟǫTW[g{8&thߔc:Tg m9{%$j Jwf{^n7K7Ej+FtytPjޒ=}y3ln2onJd=6<#[y]|YYzή9mu3'R6O5iBPks^>U8;Kbccž #yDAX<\!JukICGJ`jЏNٷ&?%2j+TI҉zfL"CByC{W'jWZ1ދbB$"aٻ޶dW`i&ؙ`2p70i%$qo5)ሒ,Q(@lfU9U1kc 0 BZGs8n>$ Pc8 TzDP:eY!(DZ $#1(ڮJږ87?Pmyscp^50^||uZ4IY bxy*P hѻxK@}™TD!ETxSQ!.e{| _Hy1jck+k 2" -R5#f?bo{8N1U%p:N8E{H NIE)6ڲڹ$HHJ=.  4q`wAx'HGu:ZM&}D:iL݋%^52Iv~o Mn5NGtL4d!3ϫy)C$Bߣ B~UEձ3_8)X2D!A!MDH<*)q֖8Qo%P֧i4s鍻m12gv("v qem% qӕ-ຠTEޔX8XAt'uj_jjo9%jp@ QHg4Ĝߘ$=-KQGL Qp@>j2Υl $I+$A\zF14gZiS{ZguQf!jf>K*ߗ_]8NBOGLЇH٬Y`u_|m &yRhke!8 p{!@0hG T1zlW TA``yT$Hy 'hyHyԙ\&17oQT6_>ˇ]Z,V {7Y+DPRa;זY!Rro(@M912q H䌳rC`\XAzk,$#H2-՚1hT#lg/Y@2C55Uj똾s~Ŝ+򙘥 _X_Uh*zSʼn \(DȻ2 W>v_zKEeKtYFPfߊTO֩FA▅( UV#FQǙ@ib#6h>3ۀD9䓆IE \h8/ryz8Q$h20 Z&񛌦))TAWIIᙵH2H0ђWhK.-OK'iP, $*;/R'M4Z*Et~\bG<*2%O^ht1PaE2 $TEDPBcPGZ]=U꯭Esg+s6f>id`EsԘQE]X.iy$0D*mX Rjl40Dk gD R.f푣!kHM1KݶΥo59!/(F($sI6]\{F mWay@|;=7?kiBi-tG;&T.ه]7ز'":rpLqq$5}?|Q6aË{v];P)}. /{,!\,h+W bqJ ! ; 8ܟ0޺x\(8<8j=Axg#DX\zڛ;ʟSxg-QRa;?lc;x_zήCoˁdKس/O)A^8粄yW yԯu6Yѣ#ah8J#3Tg~,TWgޖ'^ݍ.^ς/sk?]ueշ}; AUr7*#__sob %i[rqKMͰf klfVYޣ!c`<7uGm^^=э2!ZmzVmv0;cm#aSo8~NSU+_c/Bgunuo^M՛甙W8{`}Z$E};|_vӂжꦹ4-|q. [t( 핟̭qݧW"{-\ȋ-YYU]yNbVi:?E)ORSBSB b̀@֤7ya'BpDrÛy)O;IxCR,Bx&0DϸtJ|I-d!2tIv8C^QZCr:1!dљ%IrHW `. N:;cRM``Ϝ&|湒_>MJ+yBKsL~Wf҃[F@oìx$^™5U;vAc늌 A)P&?>~gBϑ6J+̊blO,>}NsȋuʀfyC~xG;xsGA@+5QIE o4aSz@SdiSl/Q `s]u?t/KO3I_lС ׻KI6->>ZI + QH Ǚ-%L_d8n&cW ro+ Y`rBepDks6 ŕwAD/j8&z6is=LrӋoUjȮ?mEtkm#IB %%"/e3ț6M "ݶ1d$$T,Uɪ'V-' @2zuJM+9&"cDyۨ>|ahK&+~ ;(|;_(輖s&M/kl9n~/V_6.P#T㱔F#Bg™θ`11lZ5C#"(1%4dJ:@t/%![掅~T-քM Ey}1Xq  | e |EKU`fikYfi}iJ)%_;dٜ,|coI7^]aq 'U}},@pX ΣHADPJ,]n -|Rz K_k2v7-]L|sNuƟO[ANM %&C1ϕb>`% hlR""Ա>5bt&0VH$x XFXg&' %LB }G%?=ߧa|-E/fl6 N,j e`yՇdcdDhEuxfMQie ZE8zEV3XfyJɉT%WZ9]7틋ǜNiÐ(,[1@,Ouܟ+~=F/l9^/ZǶ9c܀8#+%lxp+QZߙ7J?Wֻb? Qk2.xUKfz?7l=ǛP#)i5%r2EM CBKm/s.y8ILr m)/%76wfu&;?}>Փ8x"?,X՘+H4#/T]@T4C?r>[HHb=BUoHGPk b-&4X^PG-m̬l(ub~gc㉼뛾1C(s*ý Q)h"sc,yXwEjo;ؖp8 ۀܸ"Qy 8lK>Ҷdmɲw]G]]hlAd! ('2> #J)2LY>L! 
giJdtv&!)srWEc^{{NS'y>5}^cкyZ[% UT}x*x^Ö͂06٤)30:6DS3 a1A>Ue>qϸ'ɲxwyAZ2j>ւu_Zd&[X#c`4+L&Fn DQ|"" |/J,%@;p`P%ƅ9)1Lsqz}5?+Bw^ꚴ0]4dCJ+BU7/;>caH RR9 e*IS{Qxj#{+0J}9/d/Ex +e)zMvf쯻/PuUPփ Nٺ !'ᔳxLRAuSJdfccXdW4zgCNJlT!1 ( f1Xt&ǚHWoZq֪`|pMt>.U)xUXD"4q (e:"5"49Eam9Euߟ@U|H/=oy1rsSȖȫA5(DH;'eg!I͈dK:WI)+ .lx~wkVAo׋˶HC:Nsh|U ~}ȓь}gos'OM V7mX7[QXca㶇O,[>0 g68pVj -^Ԟk蝪;R  l}5ŕJJ,JUJCqwX\IMc>a4uQj8Q|"Ƒ.V5#Ko>)԰Q޲E޲!:IVcN> A"&XoɛbSy6g`QN, HRRE$@^{e:^L-n~-iKaY쎛.ږwp= 9Ge5N{.‡_z6fK5KҐrh *]sMoۿ/fY͢uT5l$){vowA״ZjgS۴1he[Ľ 'b{DrdޱT_6=tIt~CzP6Nո5Ƹ  %*:Uve$7|fvYGD %Q7N/ZfYpX==u u =u ulGg$D#=,KV6jJ $%,8SABeNݲ @R=A==q:NVl|:J@SȬPx:URv5R4^zP$ Y1cuVfmKy2#6NU]{ Lo|]Jk]yn2g}ˢ2RK-Sj4 ā3Ӌ_;ͪ4d7#n엗>wݩ#YY-YXڲN_{=M1vv1w!h0$maV(H^Sa2H TBN:|$Ǜ-+=Awqjm¼ 94D[ 7m 3@i<9&dGM00[v9"LrgK=ɺ_`<)K*yVC`H"eZRbL%zY9, RҪnST9`XZ"dOoҷ/3dwjح『^x[Ĭ;hv:;` A=k.guMs%ћF9 E?:s9u˜,eN8R3g`I$#^fJʠ=D4)jO9 },DAz\,*AYc.]gfv&}+ԧyZk#|];iSW} xmB um-w]>_w"LDX"l@Kj.J< zG4/桭joٶŋB\qG'Py+kNur/2Wyy],EvkXvR.<xI 8V;F{bPni[ΰmhb=^X])c(J/lVۨDcqa; qS]xytV矓]EjBk}GqZUF_'5V#y66/KlIF*m'P\Ue^SXc??0԰<3ֿ4sB&O{%jgZ+t ԠPRH-ٵ8;FkMK8~9cߴaY% ) w7kknF/pvl\qqUrIv_6I WkTD:oc8(YC$(**K LVٌiB]49WfSS$ݶ7T?ϣk*nhf[^-Wae kЮvHg!ݝ)ծп{zx\ѯ5տ~EٗHUjrƓ|9~[iv킄m 7YMe|'{FwcDo7ڈ1M,XNrgw}M3W6d_}cFݔZ1&?HQ ~_tBFŊ{U&~[46,>-ǿ_~,?Ow\wo_~N S(}"Xv:Lu_tosxe.]cQwK{~TE hauHGIa< uI jrwj*IWHlWbt(tMR}]*qPpUlU!b?C/5Dz86bpq8*NB$`!ZG&m~qLd9(Ɋ}&G\\(Mxx0<धpX=9Km' 6o=b:s#wٗ:BU)TgG_Fj}73s'ȖCQ@z-IA{өH{2a$$NC -::u]6L[Qb-#/ɉCOD +kL]87jzr~YnOQ\>`uCPVvKn@66n'T׷伽k>R۝ΖK񗟟 !^UCʄINhKJw%8`Mp3iCp fc=B9=+11K.H`A"M&JsII֌ٮ*ta.ʺP p%Qc{/>l)RD>BDŽ/hrlA}ItT`Zc_+kD=hA#nZɬQ:=D%7N8c'J: 'fymZ"z.U55v Ar2j}0TR\dB $MR:!J5b5qkۋDړ^5:jdO^4^FK`)L:AF&{s(˒," Ndx x2vbձ>= ꏠV5<܉Lm݆Q, я%a8ֳ.uMIt4ii~>nktV2Y0.XLl1ʳ^ZgQ%<ǔU0hsB=n6O8bc~zg%Kάh #YT"hZyVu73 epsƺrN2hs%:ˢϦv@o5qv*YȼTh!'Z- 'g[C3,19GC cHf9f! ϰiN; % &(y<JXIY&Æ9'"U+WRuM[zz3?+j ^Kf(sg繘C.gWE'_Us*ZkRR:ey,W6`uJ #kR RdQyWg&^Y]]Ϊ{5 ׽x-e?o>h޺|;᜺UeO}̲ Q<ţku*9*#:='{2w[id泜竁u4t]nNSZzܸ\C׳*X%(u&XD+18!y AgmC~^&!Dðɸ䞂 jkT{!}::/u;$ʜV#O)s):3сk0Ȃq+TcL,+U;he`Y B:Kn1A҃E]28.EE\9oIԒ6&sܾ݇Wx7M>LEqGhZ/j&eeܕG~ŖZ;FSz&j{rN;c>!^3v$^ԿlÝf{kbDs,x򤕤vQ:8*U/۪sVMܝ'tNoyj mŔ-MwM+#q'Ff!͵+KA\lUz񗧛ӛ̶4zfoͶAz'%vov~Swhyf:dwceTY#?Qo5]-ʟ9(yJWO~x*2-r`s,Z,aEZO%HF>};,a[ar8wIj<4p9 04jtj1)Aw^4,O`z UO:7A=Ynr!DMQn3/mLV[ȃ-L@orL -<,6 XHtAZ"ˈD \8 `{n1! 5/Lڼ9.}N)7alC8fF+z?ČRot=Ra}ʅ ::'mhr`vXCPQ0f`)uO0c 'PlƜLJ9% H&J ֛P( a:qo"H&խ :У='j_-c3t?&}_,ىwp;eP!椥ぎ7ZV䁶o.pR4 FIf4!&L,wV%˩[KI3fO$ LNk-%m*2#\\M;Q;ǖWd{!kS8bՏ+dL[Y._٫,Bg+ɂժ20&4Af؏؉,vbE(h%SAY{H*ka/mNĄY W#O?3{) H6rɲg&ڟ] )OxԚą`Q V3^JL*&Vq -Pl&ΎoMO^di"ɘ04/Rݲ-bCfCh{mrEU@5 Mr,I%  (zQTb#jD$nstSkk^r%O@ JBk96BEkknFݭ e\hUyH&UɩTR*qCH9I忟"ZҐ F#y`pЃn| 8Ƃ F$ɑ $-Up=T"=MXUwKXS >e9-Xz,9]*8 maJX1*A(=MR=? 
ֱv(p$"2bcX)q,9g`G%5c n[s(B,CLι %EO(1V*y BvH0N_L4#Јė>Wq\դ؊'4ߚǔe"M2k^~ AEf..??u54>LMa#O28ܪ U> Gܐ1P2Lp%G\#7s?>I .5$r޹Ĭ C'JYxM 2 ''d޴x:O]Lq[zK\F,WV6u~j~~Aay^rpsESߣYaGL^^qF[Hxbxq7O+6-x #jM̹{?lެ톃XmռITys&XLʴۦ:&%Ai2mt`YŲO'Bn`묂mum=+ 7K:+2"fztʃ ~ŏ`8MrSM<񿋛&˃08#v߿=߿˓}'_~ s7'ov S(`m$X$q xS㵦VSKLj>r|ygU[x/'Y( L/JcsQ[nvMGtu= Mb~H4ayo6G Q!0@[&%q^p?{j#1n'I}v`X#/Q'-LdIS\\N`񆩜so'6]r:R2o[!O84tHyHL:aBƳٖy&{fs*:4HmMO|k`{{6ϨX͙=rcӝ[yn=tV;O ݙ-._]ģXh`0%CH +ءa*xC;2(.HNDUlL9fyU:es=ثij֦}Ow<xLơK}1%r$V)4+Vhϊƚ*-b>u)_ Xt$1SuedZsr&F *`Ԯ^ HW >ua+Vje!'Pؒȭ!s嘜#d CeZȳ'nH,Y0A;;\o ja&&B+$`ÜxՊULaSbz:1H/wc3w6L{"ǜVGb; [7^<7{^%; 85aM'~8}7n@aiޑM~TzP;?,isC C}Lwz^h4#ͻ5 &yrvD+s\}烋|׫~~ѤT\~]~"U73ݏ7Tu2w$빻Eϗ.Q~xH$_;s^1_qbGLhjIǃFj`#]~r1&jC_5g/%>ip'XNUxv>(Az+*=n>p1-7v lm羜eSum fվbQdy%\xBB4 T^0yXŘ%};w%Mbsqa;{ϓf`'s`;󰒧q,oڲvA(hZa bn ՎëVԓز03Vg&aimnhim}w6c+[rIZ&^rZh9j7{HBXߌwyǢg 61Lb̎Vr杺 ӺQ×J+Yj+%jƳXy}IPYƀhed֣A$sYs&AH ;hsVϣzֱbt*TU8lOfn\.jdZVV:p%!D$ǐHI7z@)G{WE t8K󫋮=Way[v^W=$[R Nj4ɆF({R解Unһ8mC`&s5AX$B|dd޵d١^,XƬ*0IĔ`p.Ek Y$asウzU7f'l䦔 w}*M:s/}TMHK[%XZ}jO|_@ b6C'6},D/^\z}|>FpA:~`m7ALWrs:^u >\Ukw2N yht!h0 H[13 0 .u.՞1yxwn< w&eEBMScKi#F0FC I*L ,耳P4g=X"S2)h9=3 S2F t(\w3+{&7"P %@O%̦W!D..>EX[j@򺗷ޫ>Df SDSd~p%yX8h|iIG"p!8SS$ި*+L  D(A2TIG"BKʲR= ܅ ]K a5,u2m傓nBhG:]6i /+چPW1+5鐘Sh'W ezP}xj.&a/q58o~zVY 좖L 685q+/ԫeGJыva ]@HW\vżc/8fbVs \*aV07Pu\\=0`fg~3|Oū_~j /'*"^co~F&'hcq_㹨'Iz~uWRib݋;!'w-?Mfػ3-dRbJ{fvg6^_ڂies3A aq<soGx4`ŌA"HYpOΒJ 8I_<3UqYuakS?>tIE>oҖZQ`N.ՄZ Fν1Bƛx`KM4zɊ!W*;&k/wJ`ٓ$.O wg WBc0}zchfR%Gi]"ɪvUnz T[JaJ˙(1""vkr ;ay2|XXh4?wo~ߟ Әuw~zh'8*_K׶{ЭoEY8j~ ]o.9f0Kfj3uKQ Wǟ\ />OԻIdo.d7K`~1E+TZ Ƽ(VG;)+x51Y?1h  wH7-.ekWwbm e҃e'y?WӁ, >L^mUR[2EeJ2;šFdo&ulnUcmNE^9Ixv' A{6ݾ%\Ϧ۳l=nϦ۳l}D/"j+ ̾&\y guI:diKq 'IA~2hTis Ģ!`D+͉"tD谋|l#,sM @F$㑅H>HԔ[!a$EDH\ggÌӉ5 rPhZ_K.a:*%G#]LȓuI Fn##Ǔz11`$bF3 R$S QR㸷lVɊ}m]6qH}Bm~a`-/dۗe5vqJs :҄`T3B8*⨥x10=Ċ%eт{͑ȇw51ύ`HD11\C|-rwHHxp' 7%Xtdi?/pzZxh_#>ip^P}?'t 3s̏9"rOn<<՛ٟ/(8UT31wut/oЋOoR|7 A!w~>7ITML*y̰m">w@|Eb}M:Z[cDu1`'J9Wt6Xjqk&=9|꽗s7KRsNN)Io&ErfZSQ1ךy@!kΪ6Yj맍癶i'b*:IhLۥ|Bfg㵫"lgDHx[%fo#^4sq[$\iw+%+$kv;9 )qn=!h1`[@ytLU7]Zؓ8zuNysa2{TCבrb4Vfb*,B\sFvmqiWak^s')du^@mXbkHuW{f1~3m`jD5nithh6ްֺw|i6 c_u)~3P-,Ԥ_ʸ2!ڈѨ5X't֘2J kb[l/xx&tˆE,d2QNN§N eĞλ}f;NN< `31AJv*e#g{8We~T =u1~JǡA٨ejpDHPڡ('QcN3OthU6M"Pe ~Ql1e#T7-@y.R.PB@3$YDN1NHF@}W[Gk+;W8:w  ^g=?/ʩ !Ar2^&(ΕEV'4J@fVDl{Ϩ*=KɽN- 0m7%h kM'6DB-M-R`$сwI`*DfvK 73բJꟶP)tW'b@y/7CBq+=߽/{ f㳧%2o,J̼ٸl2'x6=˖#gڌ挳E=znzVڙT/cRq"PnR]]_7?%^hLHTypjПKrYX~sg -~f_& AT;d>:lB؜*%G]c%%/'Cuh $_ڠ} R3E#H:a```?6pz+FȔ60!|+1QȕCy֭֭{=?iUnVf@_]ƖɈ`(o6pRgC%(* 5F B E,Lk݇jkηqWTU}賫ʬ-v}!$s4S7mQI)+ 'T!m5]eH仂3G$T}r,bQ*6亼\u8wEٷ迌q`iĎvQl|vT%A0LJbޜ.!)&ep[Śy9h`jP>n[ev $m$jA)2lWVDT5""J-ZDhC4?7C&hsgM"7 [͚˴ ^ऐ2o%))!Iv[*o-Y`i9Y++P?[qXڶԚehЌ cgu-%> =͋yZKUBh"ݶ+N1uaQQ]g1(~os~3Zģ8: /~mGq3rp}f-/nHیJ5W>NP.*di[RQVrVwwșى2ֶwSc ?zAhGlcE:=y,5NNv[Aسy-~?'Nzy}74/Z ?[8ô5*mQPnKF]ȍx1 >^Sߏdk`g;\U[yQYRCA,\ȳ@HeS:L`pSQwۨDS:C%(at\`Rp 3BDN HOx8jK4RVn^7% k"D[q+*9c$P|S'#n,'Z!!ݦPPvDcFxI5HҞ-U1%(ThFضE*_iYٵԪŘ.>Wt%~Jv:eYmlﮑxdI %$ e!,Q >7>/wwQٳLȉj_P3iy}wIܲV6hIg&hYT*Y#DO!\XKt^3mڵ7M˵uugo;O׸-Z*٘bM3B|^G;߿λweInn)Ą;6r@3Dk5gD)hNe1+ݶΥo؅0D=PJ5HiߟCrQqtN0bQI/c1p3?fP~ R)PEC wݻ )g[I1r\bs~, /oa\Q=t54W*KB y Eq{-g`j)C%v<$gJrN?qr_TWҔkͿZrVQ w6#P(k]ObɧT~Ob(lя7EZ mH+k>7᝔2W|&!/aF 3(5縅)UqOŝ9M.۳{ T.|=9~p2o=$sŌ+Ozm]9 ZI}rnŽw@Ȳ^:#]--\8<|DA ?t`^ż_ߎٻMVMN2zfF]6W9.8;c##y`G_Q"{cZ z,?T˅ ϑOOZ8wVbG.Il8IWxMR`,\Ihird ̧\zD:kCJzON5 x<}'$ U`K)dgG} *YJ8*dL2N\sq\7dߥO5w| dmO:Oٜ37MNwwDIYIuQehdG `vF6Uә`%FMgr:5-A]Mg*)+T 4lO]!QW\)E]ej\7WWH%'zJ : uE|0 ɥBj}WWH SWP]JRWH "`2|`Hu֫'f;+5/{4j5/H%ۯ5{;uWHy@ f52]]e*%t5+%a=aUyi@g1qfKX|: )sv.g_Xq %PK'@2B{فo1k"ɿG9Us[s8W^**tPF(">B](=M{h2ʧȂ(\<P}`*Wy-Eo _CAU?ƉqpRdd? 
ÿZ_)Żsn9H`$'U̘oCǿ{تXC^aуew7~/x!6`=#伿l?~SM&g\0} i8Q12U$4"=ds]AﮠwW{O zkSQ(B&xPxcJ;)StBJiHZ$M]˗VZ.hop6VsBۊ2[e$"*#[]Њ!ĪOJz( VpH 0tͽܫڔ_|\bP7}m[]jxѫچNt7?&%U8’DPHhCL19' " "Xޅ1[G=plIٛ(b% SI9|O@|ʼۆDGRѷB5%ӻ!MjUBVtsK-DŽӱ5{/w` <9{^ֶ΁2BE]ͥ&ULcg5&$:oP"Wsqy.czސ'PoZ;yǾ kZ@YdD˜̄<{eb1x5>ҟ:b$CI82.%mΙ4V<*kKϖGy$&YS}*8 2tȨgi9l .39@|$?ve,0|ӽo O'~x~=Be{pθO}v?v+fmRv&ev.7,lԾޭ|3)8Yem|?돗Â?.مfA.ꊱthdw_^~=?7:/OMxtEGt)G-1{]iדU{>iQ(Ond~u:'J$}C/pUpKi*g:3M SVp<,=_Z=G4n[<݁zxey?7iw郳i<ǬA޽ ;yw)vՇy{pyi݅lE]u8oOKG YoÄכ[ }&|ѵd}ώySa3 Kghv7 5a=HO%C8N)Iј/"⎅sKT!=Q/{XfZf0_ʠ 9STe]R!DCO>e/$Bk&md`ca2`Y@oD_VGulTﶚql1#/tæHFv6,s;->]j}uqy*sv<<%ޑ=mV @vofgmb?>{LϘ;ݴ.j7id4' x=R 3Fɰ<"_Tɇ;VwL^<< ,jel]}r`0gD[s#c&@/U<櫌uƒ)zSFJ YYlT+s߼#v~yGEk_Nz1P~6$BxY5&k*$0%S@>ROc![Cݝzs-)",7oj.…EG)hF):PVK=E_o1vmx`zQk)5)imjMZC:6\*FɗluxSW+n%⤵>==7T,|L}jUC6&Z%i-p{.ԄX|w.ZD38f3|㋺k#KJjETS?DE0S&|oURlkB@\1Tp$xr*8qdS )2mhdUR{5$"95~g#@^7M{gp&?ĢixҮ&u.EBR(kl&vqs;Jm2tJ) 3 DV>1zE)| f5]ɱek;byn9 QQ!`@6CZh0&$ vzJJs>X 9YKM u 4dE$l>$g$s]O55\,(2Qꤱ&mYklw(! ڥYb;v4HB,&)USr H9 j3b{u`QnnG-S"7uV4 %dC mc]^R2؀˾`d%Ҽ1Q kP,ڍ!Վ]U0(* T}Ӝv [9hul3puTK@,i!ec2 8*dBCh#JyNcV#f E2J}3VlLoZ2+(_gh0lNPL&A }ЮEb+6AwqRQ?1d avNc،Qc8kMù\45hl[1 l[YcD3ml"2m8J  ڛ*4D\ QQ&A($ճd;#JQh ΎBR M·8%ÓGTge] % Pۦu@T0+K6[k R![1 p X-@.f@Id=aPBBZ F t&01ng TnjYb@RnvH'k! }@za7Vzq}.W;gZ ްcUd=2?u=霁Z$ >dE@X8piL-pd2}JSx֡*3h^̪s  QL%_)P=tP P0 lwU"[I@W1fpNJ@`-Ho@WP!֠)ci6aV `Jo% <; 'C 2Ds>Mj:LF͘(-Q2@~ j@7bpUg xV\l7b  68+dA  ՓTz0k 鬞;4bT#yg Pf*x mT+uƫjpdPf -l桾{ŷyvT 0̌CFiAb66a|Oy\h]^ky8\ABK@3.k a`$0l3Mh e,xVQ7޵ g ĹX.  ޮm2f =xjz4B|YiFr~ͭ|_O&1E\PւOCVαK1k{).5(?5WX7cf~S6odѼoۛ@ejs=?x)Ҙt#1@*H:b3;6$`|N& ܭaS"_ IQɦDZmtWWUWJ}c7at+Fj̝VF+Ab@F1,A#YD eA %/ W DKӥV7?.Bl+X0J`D'As_KVg6/,l̲͍vZ#cFܫ烗a-O`RJFi5&1M~ Qrɛv¯w7˽,8Pgo\kbj|d1";nVE 17!g|PD``B;: !2x!4ӧ%˻)k1Ed¼dZ ݾ- A(->->->->->->->->->->->->->->->->->->->->->->->}Sc>&ҁЂ! ѐR:$Ϫ^#@l줃>m~532芡 +eh(IE) CIC#Mob{"Mo%K~̃\^#Qf\z: o~pR_ZI6rhほ 5䷭t.|+8~tv՜МY]խY_άj[7}r_|lp3W`zϼIk{3Z|uͅ]aӧ^AwNݔTBɘ9 eϳ,^/Tun#2e] i% {hxVOllX?%3V)ߧ0n߷tJ.G:5,Wϡ'?oMwo{̭BrӾԂ[.6{rc eLyM>jb ws1jbFd o35ɑwAlV%B >W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|W@|4zI>uO % ӄ]~w2AqDa/ca/дRj {(S#?t6b?vS*QbRc`ؠLeS]P*MAOA1<9ڴH ͍vdATpy!D nHT %8rx+8 #k3QV`Ԙ ll^i=SHvƭ ԩUg&ԕtYI30 %۫I[nKYˀ(!YGk|DRcd4e{A*@*FrJ0t*)F.F꫹DM_ }?ta@";2<1fEUofLY҅.۝a*n;.sE0|mԴ\ߙyizIȵ>F涽޿nJkQ}w~ܝ1񷽼ܝ{C-pƖZFS+bRIE9to&fTn4UMXR2:'=9)8؇{}i~u1CͶhm0SL=qez.aT^_d9-*nj5T(O+LnN<ׂxEl sou{FjAG=;C"טOgUQ3aZϸM !-([iWcjs$CXC{_|Oy!ӇKP-mY`k߽) o*TP%Tx_i$,e1 Ňw4чV-]e%hkCFQ$ SmKV1.[E$){G{GS}Y7z8LxîOe-kIbWzrќM nǹLG+y `*wњh(X7b988+e J) "*8~/i- &%cюWBkYhOQ;+Yg)4$gv֔z*(rZ*}*1RsJQ hE>,t^VˢgH,.WVՑ'&᮱0㠕&H l)a`@*1h,:.lĺ;h6fgCoM^D4Gi >C7mp廣xݐK'k78?my-haҙ_rWN* .r srEjusɹ\%^3^|rp_;O!ʖGyTʕ> I{R32`)"'Ctr3+ۻ;O@ ^fhK[l΄IoM u5S/ Ici:|tᶎP^ԊdJвog-؟NëimtrveV\m0|oJ=O4=Ӝkg9>zr1qTJ5sa̗n?MR}߶ j'u ӟ7n/2U_~ې(zAS+|z|Uwl+A:{ȶ=T78l4OF.]z +vR$ΠYxE]qQ}b~s o??}}1Q?Y/RiE,¯ vɭOߚ!p٭bܚ|ior_[ޫڡZ J?u~>{?z#-Ϛ k;ei+ 6 bwT] XsܕUb;C4釴 R8VHk $W?W'1ymR0 vZ&,RnDNGe1'=wazyHA9Lϐ2:9U`J\[ bL{ X$Q먑-Re]'˺8L;v֊կ5db hH:_39-']Vw ,mjp M#<}T :0&ch+99Y?P.g ( Ohfo)Gwux%['>97 \Ǘ[s g]{Ŗ?AVAf FS5)YxΑl`ÀH7h0{jFd>ZR(#TOpiO飶ěUJ/cL%[3fgfgdӅzƶeօ½¹D&2>].Rs[5ioNL7uǓo\c{qDDC+Ƽ,H(.4 QW:(ŐVET)4Cr 8$M*,-NsMCLvuRFTgƶ_p1EkgjmYk󢵋g+f]`m& RI83EHhabȅ4*x`b^A9AQNR.&2G:UxdևYk>3>$x0>}F5,hU=^{&` [)XbV\W1 'MR-Q+eͱBb%#1S, A94H\!F&zx{U[Ћj:)֙lʬUыE/S<i#L150*(UGB@(R4S68Ƽ}Ńsma%p*lkƓpUϑ&~ fuE,[W_&!}c ϩ1?7P)R#@IN+뷟M%K}3|?^Gh.|SvX!wx%e\NjM}]帩ޟvᅌ0qQK&]:688́A_/|4 >H!:*-S&LpF7w3" fZ7Lsu~n>6b:ϦG~5;i0|gr={7NØn<j>."O˧'荝h.p4*oHaz&OGm29ru[@X2,msm_!f .>,3 &|q0觭5M*"eMj٢Òhu]w%aS]Uu]K};.rqQ΍Ig-XFUZѐxZU·TM[N9׊FLʛ6!w6;.P7pFW5EW5*Y5XY9Wel)NZAL=To-UUM·Z2k_{_a7.[N[ARn%H0XtxwQCh̺mmV]Pʖjren 
ҤVKQ(J[|umeF;Fj}`KQqذE!k,AI{]iLr^tglNV@Rۥ3;@ #H]OU(^,_T}n/,a/:S)z:b*jS)B(ںZcގqؼ\/̮8' a#ߜfH*7 ]göF8ЀOn.kDh(%*C+]ag Y<خzNI}?s|Lng}_6Ŋf9q|O&w߾?ۼijz< 6gxw|oQNoPھ/nL`ӄīE }5۶{ڮua[ZôZ*F+6:6+69.+66Wl * +e}ϱޡ2g(Owt'_ܩy6$ v"ܨ9oӏ_9Zmx7yx(]fgI+nkF@|l4M2p4jҸ1j1rכ6hOuկC)3QۦuDZ Z6Din6Wϸ `]>+&㑏JLoM j$d }[?Ժ +""a~0nH/N3` Z@c_]\^vC7U`/UϴV?vPӳo:jlQE+9xzcZWz\*!@.(0;Ѵ{ڴNu%´4ekt\.\~@.xg3҂0QMnʩ W(Ӛj1i1sȵL3P%VFN480M08`8pl؈+8tVԻDtbJiD{}u >VB"JmF~k ]\tE6 ty0R* XкUh@8(U]AC ltE|tEFg]PWOYyxmp%]m"J[WcԕR HWl](]!'<^WD)Uui슀f+µ6|y(Mue0{>A}tER+Ԑu5B]YR$@5SS%&Z̅9FȃoF[#M[pl4MhhK]DL5gm8 QlZHkL]WD YWcԕAHW`ÌSpU+5w7j< X+ ltEJp~<QfY/gB;iQA#W jwhF42֕Е̺zhփЎy6"\o i姞}Tu%銀-:EWDm"ʠF+eL`ylcp-]K^WDMuFW CU>u]ɝ1XJ1FWk+(Ⱥ,Hu5?L!t sg!sp5LŐzDi^䙸8"*];ň:Ȉ#r2|d6хr6ȜDms9(< 4`,!\f$i%ԇRU|cZp)i$0ltEhu"J YW#ԕwBNXFWk]m ) 䉛1*84U|)!$].g&Z|({1ͳJj&q0UzYL8Z7PcWk*RueAP{rPPsR@-;jD&S&ܡWjlZg+6|k5]O^WDrx{i[>׳aSQϺltz=ul~( ~#v({%++uЬz71[FW;ZC(Ag]PWRx+V`p+C*GmutE/uEWDLR̺4h-#]!|=w`Q cWcԕ+~弧ZEWD]24ْǮƨ+ LJnu,D*1sAI|Z8<6&ݦ|3 Qrl#{+CDؐK,QE6XH/VL9 s9e'*g9g -:6D듟g&ʐ'n8$v4#]!e+ l晑ցO]WD1K#b+ƶ&]\tEV+Lmˬ'UUAxFW+Ҫ4Rd]QW>޼ePWV_T$x,+>쾭hǍ19y5Ḡ:eC;t80}eDcD4Ln=4AYD؃d+UltE֥+!j*hI]WD)uuO豺B3HCouGCRZ ᲮF+-HW]!EWD+E"JǮƨ+}H`` 6"FWH 2v4"jL.ARt*iZςsCCGH,`iED#ʬQjQ>B"Jƨ+op ]#Ͱ28 Xᚁߑhԩ\\CA1;c q0\tEJ+L핮'ѕ%'] ltE6uEZg]PWت`銀- :EWD;Tp(J#rj2 tEA"Z2WDWtV"\FWDkl"^5jL2׷JU!'`JεIpKWQO,QkkMmH~)V5݉`#]ypB#ܡ#JR)1*@ZW 6Dsѕ?2LHJF[3tg0 K?h[CH%++uЬiM S qFWDl"Ju5F]i~XFW]!RCuA0?Sp%]P#)CuS늀-:6ATxteK]83E >wǨ+[ٱcy~c9c A0Su0rUL>Z*z@#S@k ̷g7~h^c7+}t{?/+|?~BM^o-! }y{\?C^s{D=VeCbWw;+es>?^]7ߩ|gאwl¸/ho~gT]]LWsCBLiLhç*6U6M[-k]t͞5l_wĺ56u37rFK }뗅4x[ЮFC)uڴiU]PZ}j%j)j(ZR;0 ʗm+e+ʢCw_^d|bz[sA}#n=h~Sٕ߷oV_>ؼܟ{_8,7*jv'7t;[.>#M7\Q=os5¶n/TMQU7w@S< EӠiq7m0unzdtim(MS4 oE-ƻBnPiE 5Sx9% V$%)$֑\ F7ud[h(*yJ|2?VziVgPvڔؐ??X{}57-Mf]u旫c n `Qov|තr[&߮?pwPuvTߟ퇏`f)>'ϖXÖb˜*̑OvX,)_V&]2g7Cb®le}!ܜb>sOr\uAaVxz5}ah_GSǿk67.v)]n&]74MA ?g{HW!On΋. $1~TK)RKR++|K2_&ig]SUgSw* x$'v*aqVaLiɰbB⮶\{;ֶ5gwՂ5~̞Θ[wם~&prQ<|X娱0Tmw벛񨝁Er)?-mU/L%7mtrNQlrN| N0-Sdg2PF,7J5%Q T68KrMr|tx6>ӀжAʐJ0ӗ6~A%A'mC%ոDNΘ*Xy0 4#AZ4K,m&2e{?Yx8M"7RD9JvpY3☑$R8B"ECi@sJ>K,h`SFEL1Vz#0/J+Fe(6aƝBc) 83I &~oR1d}vhآAG(%3SnQkӂ}@;o{ /A?{+~r ɃU~vtn dSY06wQn;Ap Z{Y5;c/oc:B=HB28QNHU\ȝSJ:6sA 1P H༷T,NH]rOiTag[1rrO"AgCV! 
m/9i 2kc,V3nco?;L[T}_/bO}q.ګm [O>Xٟ/|~:a!IMuݛ?Po~xV.,}Agy]gO4U<ƭ~OnJ_ޥj*/Ell2޽y` SifK[✲L;ߘsV眗EaJa'_r7[{#7S&Id׾zFH$z7!jY`[܍{u:|~qaZ/EzZ.]eY59*RD.6q"@ljQ:CvsMmի-r^]2 וI:ЬD/7:8rr\<%ǡdҒ#!A[H`zi!H>fRU_;(ʌҹ21 el 2'Jk]*JCd͵頚Cd]-Z2Τ2SiI<\,+][d-w| IPxJַ*\P&r6MiLd++ߊk.ݽ/+w@>.Vƽal5$LI3ȹ-?x)3t'C @;ҞrQ8G/wuύ8_Qev=$ArU$t[Im6O)?3Xv2Venɭ1-i̓g(rm1'Wb* i%¦ȴԶ4,s%sa`Q&@ CYdOObRas1m$$̨IPЉdir:+s)@ޜi 4Ȑbۖ%vG"++X1 Uz zSK"'HU)zb_Jr* ]"I#x>},h lG@ v(_X}x t+8iIػ,ɀd O=[]/? }2{ NyG'֋.}w1과R8kʅuZѠ]'Kvj-kEzP.2Qu =(|2[pP^UbqgUbAko ?(N^l? }Y=ݟ#QB`oZ?YB`DB'&hܱV3rܐ)AW8jZor_cB`b ғO1DACv}^'܂L 5/*,#`5'֓!:K#C{Om7ig?/V1ЯB/x!87I]$ Ì'p>SCQ.soة[њ懫j{e52-vw{/Z^>$^?F&S$B(zF4r{&ӊ{3LK!*8<}OKz@4kS%g+ C$myŒ~8eQg|* Q.,26KQs$xPșETE.>8g׉XNҝO^ܼsg_]n؛j}52L8Rggd1kU S ΑbEd;43\V2!cA{|?,9!+%-+eDsq *1Y@Q9dn =bOy^"h{6Xʦ+%m59@+-'_MX >wd4bQ䊛I[O-}`{]z͒/{$9e7HZ;Fk Y*I0Hcu?uRQS{6튩íuN(#, E6qXԣXoA}L*GD9Q1H&!47IT47(E_揻CƜU-S t؛KِFۼ~^,kk)/>~x1ܨV8EoTr[oOzp3Ͷ0$Ў:h嶵RX)Gm>'sDeC `QHgwԃrу.]Bً=QIiHMcct `gekgfk\1t}LQ4+nUԲ+g2M WvCb3u$?n_? M:R*0S)?ة)A'[Ӧ"q0Sd%.S}fBQs6qnazbɏPQkeNѦ$0%HAhV2'M0ƽV>fgqH3ZnqIqI9:ß:亇xuY+['*ZEkTpa䈐2fvY?ߔJ%6[̛2hax~F?kRC6~Я%\v.euIxfcv!Zia2Fh@:Ԡ3+Xe61߅x?Nf\__.ƕRL2Is&UhTrsF@ٻ8x҂;dTG1J GZ ͘џuǮnAܵc?4=-Mێ=.Eٕ&= ƠL52y櫓IJ`j>tA}/Wv,'ޢ88XA%t}dUIIOi146W%e@fMxUR&iːBfs"@d *{ &)'2X#ӡ2`4}?] ت۱P1|Ziq̧m.z ^Ga-D%L 9ﳂT֎1S^4tAW*S+׃tdnpUٷq13Ð+%PI.,WA)dZrq+}%se >kVeYrZ&eJSex%s`F@N7PbtH1fޒģ Y7 ;LܶD+%OqT7ѢOq{^zX13F"a$r2QTs@ŀL0TI UNOA ;ŁFpw>hrbޡI R2l͑kR>.+#9| 0]t`^$''w3wZH^=l IɹAR(bSѓeB FxqȒ)%Cib@)$o>DҶ+ 1m۫Ӓ}~;-E7.U]x0ZJӳx{=BD/X_c ㄼ}qI|xWdgzwR_j?-iYD۳3Zv|gFC#P1xc[F$e9s/o~RlkZcf UƣRwƺg%RxIv(eTw ֿt)Η x*$F4vK;űBzvyb졛n.2JաH` 졛-a' f)D꣛ey,K$uNyZ\ɺfJTnj3l̥oLp' @&qRl/L/w,enJ)2ddtS=dA9h+nmeh \I ;X>#+:5KO h HRN dAx/ΗZ @X+kq's5d5MPrgZO妡X.Q)DM6wB%B>HUR[eRCZW@I5li瞍:6z`RY,iZۄlM^NTE `DMq}~u]e(%;527i~z6z(qw7Gu}S ,6Ʀr9֡0}r_lkՌFack-[VrJD(Sblͅ=NrW`Vf 7B+D؎71 6\£gH VNf~D:5ӜH6':Ŕ6V;gIs#UЍg%)6 6ԌYL`JMC,78#d; zYJfΛ2)eqdNMffn5cU6^te@N]0N*Yl] qWUTc LOiI?HS߾=2Gr BFlgas&y>~m,ճh}v/zڲԓ"z3$[Sۇ՜\uuF9ri,yU\IxkA8bIfv/I|:I>U 5IG!q:e1.@k Bx)p b$LSqѾ _Ex}~+)G$/eeIt$ X,8JgXz"6-+Cl\RB`E{՗s䬛ӈ 5 hD3YTx]1b|^kS(ʚUyVuo >?DYoYHxK} ,7 .q(Քx\ęct y-DRevaY#z%Q,ъ H)gM5 k ~cx oq2_mLȖPKL?1lpɃSs-XDXX*=!bNF0c! 1i lF(*o7'Irqɞ^${zEݞbOt4iPDZiҒ8HB X8{l鋊|C;hQ(ԍ)uRQ?20 H>-|[l]4½$޵c,yxx@syD E|L=4VyM@&QofLwQf9< l2/Q6kEٲue3TZA8Qa,u#3h!`-Ī*G$50@%e߁bi]يel*Sj@Hc_MxN*B$8WFF4 iHU{Fp|B@ve+y1JؗeC`Y34*]CF">j.B;"(仱l2T6)i=<͸ܗ9 4.B4Yi< %x[5 kA&'coFيe^lL0mcBI{ 1O87ڃ胄P)w(׌ӌmI٘Pl2P6yqC{HzUeuы~uUq~?ӱzBq|Uh7/.3wr}@5 6?p,B`6CTEā,Դj"⵹;A7|Mƒ߂2^1|3 ?L["tgmdb/% N1o_F7;=46dHϏh{^&HyHY><Ԇ7}m')D2 /`]~E} *5F%l'oo݃ $cQ\6<-=VNHw^˞rï'kЅ_|2hZN)NEp:Lu:QŬw}{gaS9Y< "͜]cPB]~{WH|}<:lxqN\] s6;=k 2{?JNM1w%q̾8DnZSdgVRe:fS և f75_m07iZw˂/ Ji+(BLpɑK'@~6 /W`az'1vO 4lIJ!-EH5%А-¯ޗ'7BQ "8}rkS,|U%bj_/[ (~˟rE_pz~\u`u}i ~vR MNN*5P_%)UOnVD9%CX2?|^f %^])5ѽ4ٲ')?u0Rl~ef_ER2޵$&46aLsܱSWfr5A\Ic+tO@U)koyx;1֤GWSNuoNs:OMC{\`4AL>zs :h{;N;} v#\V'c,vC ,B\">|'KF861+01p0񎕓-֚sޭUaf[#n]3GZËI~N,΀B>gLzb8"iNzYV($)<*B9Y$9m>&H9 yB)L-6F! 
skg4x ҒG4ywv )٧`FO;C3:'ưҜmce!+`Ļle^\x<]~7J[:)hxJ(D@k ŦJ[=k?i5nF2}@SǂCY{LL A-Ð 16H s0ʈ`Xe@ lļK;8܆n(c&kSusFdP)ibT~u!gIg;==_c q]Aqì?R*pϴWbgLE KQB76 CZ/Kk"%ER6=O^ݧO2#BϟW9́f09?߯ł;%=P߆ "*:'q8f43%$i'Wご& 8=,z iv7{VM$_ocxTk㸠3$,%Pt9IwC9ڈvK|ߦdAcLvh"Y0B2_!n)JRsaQoj"„lXfsnUEb"PQIԖq,!x>4utbTEj R5]+nZ#ҝU6,Qs=egAMɕk:>DGg6+Y@ׯl' )b7秕YD+!R)IE V7w AR9Ϭ[ z&zܺFM =D8E=[;s|b*v82_vh^uU~ LQ_'oIpQkNЪ $}*]Hޝf!M'~~˿??Gp&G}3ؑO>|N&)+]`PĄ|/։qJ_2gfz6ڳ;lz1;kəqz"nD!M!_ii+8-4g]-ug qѰg>+t\ FHIXgҨ*XבZ>=GIzFI <"h%wri*8~|:)xR9彨߾Mۄp-~5>\ؐb첪S!I kܫ#؋/WB]ֻɬn$헠aҔ>ۦ41TZ4L!2L /FL֧gvtM.<%Y=\Ղ%(ᗼrNLdn;=Mo'5U|kǕ2A[R_Pô@Zy[)O2IG1A7O@Y0DK{#zFHU=oYGTtFu05ҘuvqKeK8Aoȩ.>Fe0,z*:uص/K#Y隹lVv}i];{ #Hqel/0tQZxƾk΄dѴ R Wb) XRL_QE^"f+{Hav4R}? * T;UY+b%%B‰]9yi9nl@$T I@;b)3zѯ3&W|j=,v5Ceadu,tYMD p1%^OkK;/!bV29羺b.9d? d-x\1J0fE{7/W`gJ-4/l*R70{| Rݻ ;q;xEAKf^ <.vtv({*~9}宨= ?]oGW7Tw[?,b'8 ,K|B?-nlj_5)#f8!) XINwׯ2wP” VZct)qItXkQh[cX1DH!Hf׾S4lg[UMUGt:GR!a8]XLnE\3KAEw Pؑ#vO:Nd/[ݐ(@cQUI!rDBqr`M)c!;(UkLۃ6 .̛PO|i8k4ZHnƾ`Oّ\ʅ%JgG25|f ŽuƬώ^CÁsqþNecj S=c)}=J~N")-0Qբdz4r> g/]z1U=4&O"V8(0j z6%S4i"Z_\6x fRB'iMb}k xYH8`Z>9d"pʹ`b! 4_&9w.eQQq=B=]a}0"yCu?$O1諴t=zRISSnmTOp#DeIV4 '>>Tc{Jpy VH`?(.y9CW@*%{rCw$O{Ǩiʞ. ȇH|ʌ%{ t5 ]C).@zUCI;50lvVh6=Ci.(^aPPtPpa mC9AGjËpbd ykCCF)16 9jׇك5w47+5FAr"7⫑bɌCg+ }R %䁩 Zb =oB%;8};o=CC݋ w\~{HY-CH 84҆qj qZLh<2QzW~ײZ- iܱХ|0 ):Y¿=goΝs|:ra/wJyW|m'%Q6ܟ_o~XI4hS۵Kbtu:~դY_ܔ4AHݧ2A}ѴҘD*F|o>k?>PزR ^W.Z NFV:h3k*Nx+M@v)zfZ٫kwƟNNP@o7|-u]\M%C_15N}M co;sSH݈r"K#=-94h7 V[d_2پF%k U?ڵ ђ z*c㈀{t bz _5c9LKj_h\^h~Dzh";}^zs<면eL$]\ $X`>y׼M9_v~^04!=CHm<YɽcQ 7ؑCܷ$c{#}CDZ%5Sv2< UB6t2JFAs4"h͛n%U:ϭA; "{P)8#=($ Y[@aov͗AߚI"VIt4 q$%Zɡ [ΠME>FB/5P{76=xDJO` cCkLt/XHTRF3/6$uƊ vڍuvBpB50ںP]s1cWu kHi8| j7} Z´fg#KoX-wGo}ǃD鉜#CWPsRiВ%:{6`j~(W˿L"GC*TxB56MК~ܣ@Fua7YKc$Z 3JpJ@gvi-k5yW<>OOYMQ?7g?~^[7NWb/,xfL8H<ы9 K2\dyfݓd:q'wI'[nv|m'!E>IRJ:eY)qJϤ!Jz3aK'WNyԧpE RtܰpHNw^Dke%3J' `Ȥ,cQxed{SVR(XSk$X7;kKx׷Dj|?Fzt:J$+NKHKG9-_7{;;Yd8QV;VYBI?[PU**:rY5nrzJ0P-EiqR8 iB62FڑSi' !9tCNY=!&"3 HhF9dr*B 똜>$}?2#-s?2#-H됴mV$TzP+{x~?&*YD Kd*er CN4-KHV*Q" @D5ȚeCI&Xj% YCsHܽ NC&z:d_UVC>`GsSIȐa:ΐN$! U& b"A| ^P!CCyPZ%v0pTi~3AW m:Łb^pOąig`爱=KCpJoH"ψ 5J./z#=[ 0L=fWΎ3I6/S",DN1eytbx%T1Z(IQ΃"Jh]] .^{9b, %BJxxCeR>:BP6@hoa;9<_lpw8 RN}!6,ڈp²%LPb/\hLVP%1pUVyW?}9U*Ҿ_8Pwv2]RAEv*NxSR,BgӉ^.S5!`^eupYVgUup3*4xj4Hg%{y>r*FJD'3!z.J<- W%d|.+Zov~;+8Z|C!eʠ'bZ5+3$YK=;3*RܿvEβ-rmlUm*;;.W^pg%B%@\GU&XR(+Hs^jsہZsfg^?m!ﶞTZTTuu:umףԤ醂RnV S+rgb2ԣ!&," tN!rT8(jRQeZoR1"#ԖI5"p" ld)P@[SJ#@#qJMJ[/(P⍓ r%|)C;\$h#Qjt-h4k{ֽŚL[ ϰ5nu*nSb<2!F@?T2C)g[qƒ}pFX?I6E}+Q@*?'I)d47=Z1 4k"x!\BP>xDR$ $qAY 49d$ Rʈ.ױ +aJYl!6;jI&:TH ='C9= R˩Ґ{Vz +,{*nYJ4y^\W!sPɢzZTEKtm*z׵hb-s}! P|yTm^PaDz֡N"<9Iw# Cz 'N ;pܳl~c/wXq)߻YE]6w $~C^1ajz7̕L arOׯFhGȒzIw?ŏ?|5*f6N8U߁19T~vqq1f F_O.'c*$z\& _ݜN>;XN7__Q/9=9Sd^}rt~;-x<[t;a** +knHo`#0+j} FUV @ e>Yq4P>Ѐ hTg~Uv+Wb䲪%$l9qѠ!$c-L؀F'⠧Q$bwגt g7ݽ#/}oCB2:DbYH;Cmxhbq, #miBJcù<8`8Bg,괬d($N`~7 0͈RKxLQp7$uC r]QRxN7_u7 3ɳKgPx4~ZHH.L~Y'vHP\L+!tXL 9uP Kl-gZEQFr{'5fr&(86('o]~ *. 
,m,31% VO0\8Rrv6wAAFsq"u=xn/`2rYwߠ0xNJO50`lH &䍵 qPÒX$ #48Dp]<3܇*-7&C 2y82ZJJ _fB Y.Ƹ_p8F??[<1ͯΟƱpwJoeExIн><8`o)|ۙiP/O_+ `͕`hǂo="0ڱL :u!Bk)E+1M!/fًTo#nv7@MuڃWqV=dVTobCH1+F**8hX V[ Ee)ӌzcdR DU4aEw,i|nxo=:٬2DY+YƈFֽg^b X%^)Vż>oNqx^uw8$z_[6+b9y؎p47EE*++%N2kw: d|oaqΖM}n ) K!A66D(pPpoke#weoS'9v ߥc/ f%4\q.6*FU,ndb"1FB\j-`I2c5MB[Wx*B;uʑB;QYbj$/09g[J+,8!C+;wNfߥ5;](IHc--yhPb!HYUr FaL(,9( }QBE_SqQ#boGMSR!EbhH ʮΌ$L ae2X9ƄjuYcY,RP] `%%\@a߂A‰ŞB>K)H6S #e "^r,3έfRHTı䖘Jr &N,ԔKCQ dJ<嘹So8 ,I,v6L7ɬoܶڟ=*+o3;o"e!?AXK*R%egGLnFC%*'&~x0\:c?_Foy7| :Q~ĿGxTK )O.d % Z/bb|vmO*3Aǎz۶/̳%rwegq;I/5Oȼ W_}P`Rߎ%' G\rnQ|xx}.v /y}/Ņ3tnH sW9u^; r&m63 Ji{I7$z8E]5~S' bo,\x_}f|7;1}g·Y0tf ͉dP5nbsu6aVj=ќݽ_ϵ['~`NO{XrL5@x l\bIn;U555^N5\-]_Rl2@M6>zE.G5a;}=mL-)DP5E>W Zq: M!*[ `̠߱0I\Qx7PfBfϯo 4Y)ry.h]x:/'UzQ$GPy4֙:i+8 >1x`ҁa Og;%G=m1(a{y+JkrcvY`22C{/띀-Ƃy{aQW63U{]%[@we[K'gٛ[%i] GG$a |wa{2A  `p> 1?~ЯF=/BW4d=;1+H@K^h3qnb· ߣEKK6^l[*= @SL/eS]qI<J= *ܨf*D-ӆ^"VY6b!}Tmg 2JpBTXeUW6P)]QA0tEhC0{&ro-۠T NbMiHjА$J\0v>!bVo$bAqV lrhNw1P劔DF=ˏ9WD Nǒ|~\KSwKN0K4YAgj$A^Ɣj$+F-{?{xs#t??/vٵԹ%6l@q}yx'nyu'Nيkn%Of9]3ɱ>pr0!|Pg8,eWTg\؁*tFuwQrm,ױoCU=ch+(߆3;-9k98U]ݱY4? 8Fb!T6TB+[fcy4=^1f:k .U5)HXn}һY?[ 5śEOIy [00aL]wD`:U][/K=RR>TŒ_ eJk}ZV,'F.{GS-c'LJJTW"D dI@sHN( ;ȕ%I]1N+>i"XϿ~S*_*(W>/~MJae3J[)85S+A0TF5zѻ݅dk*DICu3$ܣfw N|QQ^y+7; =͊Re#%BV7UÁ/58BPuq-8Y^}BJVfd,}!?p#Iv,z0՚Ԑ%xZ@Pft;vqۤqK*0r]_/ŭl%Bv]-CXxp*^BT.㔅usPל+JdP my:եs_N$hcYs|r|↚UjoVXJnB|DKR3 SYe{†L!E~栙Tt~NÉC8RXa1{䪧.F $!D)gtϐPՅ+(ov /Yh` Ț =CMn _i-%\0 Ӳ靭rJ<"X?pNh5RmY+ "iᚁM y 7ZPDӆCsk.0Av\nQRT wzh+LO>'v\.$`9b7ݫ`r5WW_PIWZJݔUJSp|Uc:5^HwN8#yˠZf{&Fdk_@ĴNeQZ]ho6n;(ڄ[k$+[.To[g_Nj>wYČY伡]3kЉ- 5я߻Sv"ԍ+Ѱ ?VPNzrV3Q^tEbRM7։܏=\ʓw3s5Eլذ'ոO$]*HPNݯo~cB.p?v.g'D07,oP3WtLE8\p +tcc 6VP@C{$1"k8%M8WBYi5"(l@h!KtrBH¿@Qy8*UV `KÝ5fjK,߆.(Z|1a?la磕 /X2C4hpyk^ň*9§>}~o$U$P `#=0İ2ޱefƯZ"IatA x L30vF Vg ,Ը/ ;6+:l%#D5+J>pRF;'G| wnihTvB^).(8߮9$(!$W%U*U.AN],j rv""Tw=I L!B c+VBRJđ8PJ@ %1 fDZAE Щ,±e/m 4Q`UBRgNFrd @p10( b" VRvE剢st|i2$*!pc'$LDZ˘a!Jt@ GlcarpӨM Dyje @8dVrji`9& #*P8ROlV+"t\ $q/*+Dk *@Z:Pe#9Sĸ́E*""\1J(T@PH4=0<=0;+p <(y#"$ {ZF,5=e A0>#¼"YzJ}fJLf OOKAzeUG~埧 "zӛ%=CXy潯/rRtbܭ܉y`cѧ\ogp?Mfpdl08윎{LVO eAS~tb9& 0`6!80HTF0K0, mHc I8 o(*"ol#_!rKŀ 0sqEVmHʉ3ȿS$-DTUh ؔ]uNS; z-W|r%a t\->&KʍÁZ ^ A i< &!i<eRI:SLs;  "CXvwܛRDqsI{O}o !^$/ho|*UʷkhN{<96)߮9\4;ܢMw6{5Ԧ=ZzC혉%qSfFO7*)rrJJ#()rQ<ӤEs$a){)qsJ>nN^ iV&0)JiXcVrz&0M9b4t1A}L1) ~]4\5&Q,p^Ggߝ\mj@6DZќ\“Yrİꛨ]s%t~bKX[%=Dą%rU*%$;_Uwvs7x*L`)N۹`v9eK')!FV Or\]JibhwG1Mhélc֯Qu5j -F%H.oNl\jQpڲ{2?][܋ٿ1e04E')yzb6F((T̆hSdn.K11y^VkD8bSFtF=!r~sRgs>^XXtlwf~s&z*1k{œVxOH2HnH < 0E"1D}iH Z(+r(J}IgHP8 Z|+r4Ov-!{L^eqW] Y$rN%31A Nv[1R09&wl[iyjѦ0B2a-@D0ݬm/lM{q^].+S_65bM͇ئCMvq|> J+)¾R`$V >ѠKoVPQ~м[S%bqnV]ʬ.3Hrj֜jln|BX3 ;W6,lßYX h*VߓE-'įu)E&_йQIhQDžNTe&i[Y1DZ8k. R L.:#&e>5"Ei}BiJʖF:{S ])b 8/S~#n$.O7e_YЗ9`K̂VBTsi硂߀2rpUe3aWgP*`(EnIfq`IgJt:x!OouqR[לԬO;^q6&&4L #Yf1kEzt(ɏ)aawHQf80ڸG?q)Z2}(SL~|b ǘiT7$XgtȀawU h*.!OEsx 8h9Ms|UY0nf3+0Wܖ.âc 1H*mIŨ\ >m D7bwJBj c! \U0m!LT >0+Ueصf+;zQt.RLm\ 渦Sv]Y48%nծBrq:kn#zs4 H6ڭ ұҸ R` F睤^b{7#zTήnOPUGrbr`Щk0J@JY&t6K+e<&owvasq%shcGT_@S5\(ѐ'΢<%֓7; Sn8Ha5h(kwYnـjY4$8w\NA urhƌ"Eil@S[ y,Sky@`':b8]RrQ9w+paӦg| 0B0i> +nV-2g$*h)KEq4/׆8)w&è5!krFn`|8:%՛˫|v@Gp Q>1rRc ̨^q X q>NGFB-R! ؜gZՏ$qc9"p])r4A2ɝVBR\Iđ.q"8qP3v 4`7obŇa07Σ`g|[Sq>=sK/Xۋw[_N \_-yj,ׯx 1ߕ9As׆,/f\,Wl'~|/={11Px9u0ݬo_z=/՘N-(~eNn͛ƿ/q-NbX^é93GnJr-NV%XVI<.bLA:D[g\kt#`^\Ӕ,Z$KA7!! 
A<(KBJJi= !It IZxAe:Vk/aW`SO 7;U,Ğ}wV9[,Qw4!J{JG8cjSK"L%8M5gX㹶` >t8qQ#k]RVXw4HtIjX㱐KHW`HG&M՗D v^s{P9,MhvyNʼnŇPpR ۶6{V-iP-j/K'-@˞hvV5kP?~?B +U2О`jVgфpk 'IAT`6Rd_ER{dy,SGROq;LoP>` Qa‚gUbɍ!e%:s$8dq<˳c8tZ+m6 T4[Dw4|dgKl:[NώNr@DDm(DpՂ)fw/F{da\iɦxb¸FLL'4eBcFq?[ve<.thy'3~yzM3").>0>j)oh꣖bBnEخiHcu?6?J ^#.k7Վ 9 ~ՎN2HjFp+]%2rQaL eOP 42?zV66o'Gi"!:ӎc?گTVI^v{IJ>힘޵O(p5$'Q"YP,[K( C^Hd{l.> I5)!qFzn^`tus~(U\A}Y_d~E]O~īK"' ͓6@e$(dGbj~XpFy{T .f7"XM&NF3ZP43icQ/S+.(viֳ J8ZpwHv bM0J~qڤo>V:O֬v0VK֚Ni(dz kh;QsH9 dh<`ka Bo+Vtm+M:QEա_T־uoAׅ{IL]+o'\JA0rā JmEiWF磴%0%Y][KZQHF1C(Tj^~CKu+g;= [Ktn6cqyϢq1?v]VmݛČ2aGQ9NPY/ُ^GG'l%jdХzrb'(iM A+21DKu Ҩ9l$/RJePT^V!ĈGfE BO"$itfVMV{-,q9J4tB^qdhqS)Mb["3ݕƼKT#%%F)ͤIs#[jD0ه q0: o@;Ի !+U 9:7 )/j>yB`oA!G`@r%e=0˄`pyqyqyqyQe=˙%˜,ɺHC ]"WLDɵKdJg9w}Q,/_Zw_|@Rx]i"b3"x>9:Վnm<2# (s]p) y V%5g"X)vNG 9D'Mp s "<:sEg8ޗ$J~ U$L "U`inZP܁F{&'O% >vIԙ]0d ٛ$L@fB* ~ ! jlSu8pRS5mH a%Qgbv@*Nf^&]1%B R'xx6pKJ CC[b(;iof'jzowiP P+gIy/,#"( 3A!X$Mq3$Լ] 3G9%e-\8].u6@UDeְ \RE;QѬ<`H%Q *A9 8a*i$E<ͧh $pTYۅ1^F1^/ z0\qDꈷ"DT0IE*fD>j H*2|w3Soi/h'⣫zMLɱ"ˏ|\ pO(毙`fTn~8Dگ}!ZޝS|;jR ]qO3 Wv4n5W1*xr  Kzһ+/yls Ѯ365ѓ;F]]݌>t), -GhIh}!VJ|Si(beB(! (LR-%RqLuP\mMt̠ VU FA@[}U,b-(CFK1B| i\tA"X6"E(@Ammd3H$&hf('$L)^7A/W8g$6u&l*lp.tSnx7'M\2NO6qn :sʝoF{5FPxJvZ|Ru4>|ɏ XRyhsɴ~I0ݏ4(OϦ)8ǽK{'g@Ј?/ozt55(^"rpcZņ9-6Ài6W۽b7k\י/+- &yK(x0r0Ek7X":h*BrE4mle.MQwqPnD?EJ-Wl@76$W ڌzo5qaRp;[˜ @肴M\AblF7woonvB ('Cv{uь]S9g(P tz (X@~.z|eb˛q-qn֣mtlMz7s,ݠ9K/pp5f MH@;1{O>;=L gZ}:(Oz^7ك4(cU6miNqH h$SBɓS+q{!.G~&PM y%ooVDN7o60:6m~cWZM6_iՋX 5pn9/on1H/-GI/L})i"{"MTkYy#Z -{5JqH/VœoG*AP*oR-o^W+u7iV^g:io$L+}fuVtu+0oCtȪz褫|bd x,ɍV+!%+M9ƠHRZ_dTjB6\F"yD!@QzCKpl ;C4re%HMC+S4U:huEM>cʹae?ڶ$8p+5;Q)Q5z"쯽bF$O. OQ)݌/㯆}U]t4lapr6tw 33@%9p2cHLLp p*,%P˘ >v$ޙhׅ4= \rɌ$qr#  :NцN]ȨbmZ%#U7gjrKES*.%j<1mREq<5PBiΗABKuO$Ut|Ɋ CƣG%ܬyGEަA N$4F)$&k9,k0&w\ZХ6C+U!-y} Ms㮹omGS+ \/+XZ=|Q;Cpyhxٿ=;=' T-Pfr%ГF=F9Z2ud`?\WY}H5%dk$a I+=E̹s!wMho~v>q8ywO3Sj ]f>]|ŊOMFe';p*jέM한k~z\lS}f9+=[_+yι8HI%ON7j')^d\Pia~/N|>ܑ8Bw&N̠ADع5jL߹ꠢ% z a/>!oECEj=G;ic?_7:ZKhPJ}([WrQfYk,ÇSY3Qsڳ6;xYaqBR |Q?W 7?Vz᝟tjF N%u#vմ̴ܙaO\>'~mU?Bh4>yk4="?++}~̅[kǃ&1PWF@whY/q\cmCy킩z C|ԚgSuAt:tۛ5y*[TRrv”foq.I7[W@;H}z `wҭ{ZytB^9D0UK@K5/&Q%dӻh3zzû9t4}ώZF_ҝN_}C E_VJOQJ @jKd[ @j #I\zFIp{uXg 2C} . GU{oУ~|Z>oV+,kU!^r%=DP> , Q}nӝ|fT__ oQ]]S, 'Y]x&(v6H 𕄔wC;%[ǔ/Q\/{y33"`ų`|d[{5ߖw2 ڑ^[sgE|ngӁpX/FQJp"$N(ƽrQ. J" \I9# ]@kYw&HM`U4s_*PDS8kIPz(-M 9`lenu* Qf]jwuG]2t wǏ(YCuX&{gG/7U{gN,md%4ŢNLzd30^8u 1Z ۯB@)S<[,DԪni)>-t.pg/E5f. l5M#)3GjϏ;Y~E+qRtI"4s?NAͳ]vXQM)W5H愢;*.ʹ5tkJ)Z*vu;ߤP!7@CqUhszd珓K0XR#6hPL(o1"H+/'ĪD47)1JFTHAeQ*^/%M i:^Ȉc|Da&#ac`A M)ID B@ i -I`P+)1Tx/̀R9yP2} ANDJQn3R (4 gבȀ2!kwWÖ$8}q B@},*A3hH8sŢ&29eκ$^Z]hJ-lbUTm*DQ\0sĐ)/m_R! .-{0.Lx|TRK WldJkmF0_{ҰH  A&|`w7#Iv[%K6[lZcej~,@NZ}0C"VبٻJQl5VXcC [7u\=ē ɍ >w[خ7Է~6G_LO>E-dSWeܶ濷-C18>]V83 䖑D%:R&Ȣ(twN!J0%.Fb)TvTK-BWAQ)+"T&ԕBfΆ21t"ngZ8X:dojZV{_Z%)}w8a>]±ri܁NFɕYT;.F88vDzcd%P?Fz/SUʥ|З/=2r%H?6J& Wa*`+[#ځfVޖPPŠ'lD6$FsIQ7=cԖ\3F*RW<ᮿZV5O#_''\u+UP(+mPR0U6^WC4z<q:5WFt=J t'"L"f-0})l)mVXg;|+ v]ȧu?#H~Yn\em6=62-m}Yk5K" 7;c2&riA;o+N%ll(E+mjJ:_%+mgc&il mĖNg`h!=g:7:Jڰ@1CϢ65-;tv[R-arϡZ1POMiXq=,eS$8XkSPU2`@%ka Z+ןúuZeuG+HM_#Zw\YH`%ڐ&W+0"f⭾XnTp]N#@22*M{VùuvZs>XJo3M̨pj9 Unm9Ѱ#6s%syy,F[.R;FG27_,QuW,L yTa[.R;F#eJk&j.!)`isjg۰y5.Ի>e^PwwśEv1x2OK{{ui kt_>}BX<퀋 &j죐djk;ݗ8cEo)>3H̼1B*`e` |7\WHLБA_?}"٥UAs -nb:XxO5a C1غJR}$rX+-}e;r|_>:2+ Ji_k;cS-V-ǠwcfI1?ݻדQA ,}FoF'EcD{]xK&XyLf1`C1CsSA;4+n1#[Ѱpyu]<[fd>j>!`(:(S͕bߎ wzѐeͶZu*ՖA )4sV{HmԀ!)S?n[.R;FGKӞޟv/ݺ@+h jqj7vA trhx Jۛv˯ݺ@+֘R1b ;XNs=? <1 3FsTR1`Eb5q{Ka(: 5hQQz(ĨnQLA9OWV((=p&HnAm[Ee̚U*Y&W.( E$\c DUyX2EJDAxt  ƣ )|VYe2/u(c^a-1SB! 
` $ <^+U+*m%iS\ْ^eY-oT P@0t,/6He3VJWX Y`Q42۹`.W @酓$(XEf6ҩ+iˠڀ6TyIk\KYqDg+M ס2]Z)k(=hIC)`>]7cC(=]T(吆R'ACJkt(Ru\%AT`JkGOaT4:DKJNCiM5R&4>W!إS}JBvIUj26HCiMB~ԥR.P( PeJk>K¤TxcJMNZ;-.%I)j4{-׮Br8RF*nxQr VU[D&)\aC3TCwC7 •kEPӖ3,+(}A㷛Rt:% ٥r̻LH`'AL RvRo,4b?V^@5^ruYW8ֈD.EWk|>:JÕPzሥ҈كPX)`0ʫ›^nV ڂ(K(HVSX#4{_)D}. E_lJE\#OB'=5"gg F҂͋Kw_V:1S?3\<*!%o_{_haEծ~a~ro޾3ލW?+//?9dfI4Ob{e=Vf%6k>!qm"ͳ)}˻yguW̕c*;/GAWxʇNWl4Jn5UF5͝ E4G>?v9n;$zG/^rZ z0kEK _I;qIn|z"0Q}hR3݇o r&:q%=T;/BL gLl#)}VKըi[Y-Gs'0/KC.JiuX(v@+1V aOBT"Մ ;ječ`pF*-Os1vzn6pPZf3ޖuf<lJ5̮7,ϣ}ybb9OZh㽺pǏ.jWuw-T$~n-IF2`gdRpID2tmBAEܚP0YؽT@ҊC lKeO|j#Q@-4 6 JWL8e) (@:$D Rv}]I1 cUb׳e|lv,D1my,}5y4bZ#y$]pu9orSKsLݹi ջC@dyLoxf>KǫN.V>;?Ɨs -͂idWW?;}7j }M!YeaBeE$ȕn%*زF]^/8# eb\d[Β>"i[bs8H3LJe(mqP( ^cؒe>.Lo";nnEG?(=S崽M$Q>(>4R1d\m ϣKv#G]y4~6!6g2㸸,1ӳӳӳӳ&L/(gc2+.Ҋ瞗*V,/ΣNCaxUB?)}-o:~E Ut|t2Gۑdv{R UHrW9;A"U+adSsf{YCM^)\v08JF#TLkbRJ S@,@h,68+(@YJ؝JIksO$WgCp<[RzY<=@s5o߷O4Zn {)?}79p?׿23 g\IHS7WoHW\tպD n>}ztb 2IgM׋g''܂TV[}p '}a)}f"{η/y%*{j"mlGK7(o(5꒬b|u|W2RǶ8ifCn׿)ԟwxvd6y}<2`XΠsC |A30t9'{@JMFB'](Õ!ɻXVjvLl*HTbheⷦ=[~1C/կt_L)9٫6.|,2Xڊ5jaU_&6\1)%b38vs|.f/ViZ~'#ㅁN2Ki=*1Pq!+޵q+EЇ’~'vM4=B.neIG7w%k(dj^ C!bzجU඙p!W 1d"B054=!%gM!屷w hx74J(Ftvtӈ1Jz}xN@ 뫛h|L07ƞ]zv0՝ܶS`ވ" @G؆01B0ah^UKhNÇ4Krv vϞm({ G+FyY ,|$=0 %ۚ_H.dgR2[x#~hZ{z O>Oo1|X2O$N֥;_ X*2"wc'< Qz7'GRD%r]\7۟[:pX+]wzW[ҕ>R8QQQő9߁C#t7ek ~Cgc]R%s;ѪrU1%E)`:#ՠsDGRѯzi5rZwf" $66aq%DR iW!ڱC* *[3<_"`fW`WsoU.<j6cѝ/m>k\ΒJ^ٿd!rNtry['(aY:sKP\Aݍً?"Rѧb~3ߚ+ziKt"l?b[ H*ۻ?z-pUgyً RΞ[)`IUyItTUh`]rel`bvS&v Yn+8TXB(#y# zi&xy !8< tUrDd-ԾJk|v|Aj' %$%G8֊$9mEL, ц`J@+TBq Pa>giˑϑ@X#d`F^O+]L4l*yF#Aറ9["b!lDU :yK|--}pe6H#G<}YxAmeD$ǟBaVhOpV#+~5FT{b"x,ߗK D65rpφ8*;M|EœojBAT~*lwr]ѱN׶DMώӾ(pDF0s?LX9l=1jycpOa$_A_sƪk8"N8kB3$dʥS&gJI -Y`sq.[1Y͎HNPl0#.I|8e ')-!I"hD$%4&80k7;I‰!r :}Es x3iFtg{eY &g%Q\Gqu|GqZG%L@sb1%(Lkθ% \KtX ј3`b|VhGJ`9 oC; y-ֹ4S < (㣰:(,/Qq +rKqB;$ BX.q`,F&J 7.2Q`.Q!.93ssZk Cx=;y}IAOiDh ijEFĔD:mh:qTDG.\ vTL v"*PUwKiuKhP 5GtN .Py-30d;ZP=up3@% EF i'F"uA17_.Wrʔ}m2CڇL邪Cff`O'Îϯ?t&#K)b㷑_۵yeO9y#Fkr}5|6jO~9nBuT`gHz`)/)nj?x}߆g0ci{7`sEZ9 ܤ(b"67P0ɠ\vKꈀ-:ǎ2Jm& VSṬi'D}'7ZeB$d=Ԙ2Bx &RQڣMj egwcL @"PpE m:%@RtL:_G+A^M`=WPCAKII68f m GR1I>2͸Mzyā2jNB9?īKN"' b3GxŎ\%s45P2[=$Ƅ("iӑPPVW**x^K X .( )aכc 'MkF) ru^@1̢ݏm* B%ϒiG;0hxR-7̈́}UB0qyԓ@RlRWQ; #MA./֎|;yCRgMo72;vJ݀5 >([#'uoH4B&'8\ Q1hZMbMk46VHbqĔUYT*厓!M^VۜʹoYX"nL.p=_8Y-$ՇjB ϑʘ[LT Y2{וz g5>3˕Ve_Sw^Ό7,GN'na1LƗlWJ q]5EwOfٗ?R#s2ɀ +(Aw{2|ƗcU* k1G*,s}0@۹N}7N+QMC)S5w?l6i+,ܧ@I @ꎤ',m=M-Eا}cHRk$H%Xs-sj'1GT2`EF1Ѽ r7#cDV筽ww{Duӧ*̬ٳzjOM$S9HM2fyG-IK}њhgkfzYi0f~`ڳ`]gm<^ |}Ƀ郢}d}v#ɳN)՞(XO+էW`!Z\񥉍RYZdғ [{f<~\)==Qm@aB:ӏa*.Q\9|\g Cc}84oqCȯ46ު[ioҰ3NnwEVx1ÉʨIrzöp2F`m`P 4a"cX!]7^,f:)>ЯA/v(zoJYg`e貝v>.#|Vŧ=<##(Y߱Sٳ<~ue)e A ^Ӂ#M$e鰋o|Al.᎟?;~ws8"aQD%8N$Żdk`c @焉ud<⛢h˼B懞YR*"^q @Mˆ5\'v?\l[>Hx'D~,= 2g^oOG6NeЌo'aՂYD?nj*ߢ]ӫMamt 0^5O>$U[A҈ynyl]:yIp96ˍ,4nUI26`TudXXk?[ލdLBK*⎬$D42J(b>݀n6JcH䰊ځƏ}Fkʉ2DH <ɥs",PX[N] <\BŜ^8![f$*}vi;{2GޙRnE-AǍMw<=4Gޚc8kzIe2g|k Q/ M`"`d&‰FpL"rdAXkxzk2@=mg?M=㓯xQՋhcƼ#LWYeY3?d1u!V]>9.^ `XkˇixZY_|7߃ WQ`4<~S3_>nR93~|ӽN&_Jqyh4t ?4w1>=5_Y"+ӏw͂hDϬ3;5nJ*4Q/.p#Oϋ.&.~{o?HGx ffFulzn!CuHucN}biʃWfhn L{Z?ٮ`_{w˻QL͍tOoq6iGc[bwA`0ib2=9~?{W6{Ξ_27w&Mnf&dw #Kr˔ v;߯d0c $`lYz~]#{qL^u<ԪR_)[%nuP }r[wOJ_q{{8pS4nƩ?OnQyvtyRkҟ.o7r6o7I< -yZKFYXMb*)~u u ~NlD7iݦ}0ݫӭw+6H;H#1Ahd~޺o8fyj&U Kb xլV9Gs$P%LX[ Osq6="m/N0ސ i3̈UgZ UZUZUZUZ(V-krks d/Iy.P_ -C!Hp51_P\JH^w>-8vs#8jXKXHd8 c>А׬ ]?o98FK+V?2Ã`UM!Ld5CkwWAA6RPa2QjJ۫wb.+B>9 ݪ^VS^Jf_%ℵz((Wxhw!P [Vq`w"I'᩹tJ48D"I5V9u%QL#&'!`Z(TcbT4 ,( fk r_lzV &ǓDS$` Q 4Ga`t " #5g:Kp*RёQkQE0tok3iC4 2&5(=Q#F:Ր/tqlCk,=CQEֹ(rhG׷/˞vm;bmA FH"$aNHD"!/#G,gD=5+ڹ)` Vq V,qA$P7X&Ճ 2 l3՛:[Ig r@"*5UY+k ǁ9y,jz /R^jKmx /6lRd禀ج /;ix!B>}eȞx1`ux+^׹ux-׹?Bx! 
V׹L`o!"#y[ gyPfR ٛqb&׵TfHVTHɢiE:58B~s/6 d ]*-,"1#T[[B+.P#`QkaIwL>l,@{n|fe0MQMTビflý0{fGV\CzP9%c?`cq<wU>Qo>갏:>jsŰB; a<bڄ"K(S ),.Q#A\U6^iNT$qNz=P?;evT0S;`Xb1 OK 0װ^sA-TrM9Ӏ@4n@keyxPB4gC6 w0 Ajv08q -8oB<z=X ͥ^US0QK_2 7|Zbpq6%E@+# H׈SD)KKEHÜ^s" h Y.rHϰ&n]@ dic$|-y'J*)@ 5 Z%)^lHH3<[`>G<[(yV;jYtoҋKkUY$u,K1[|!\܍Pe͈ ĎM;T-0qx%o#HҕJovc87*4=/_5ݸ[R-.? 0ĩPԨlBEQd#iVC1;n1T)-#,Fi9kpX;S}C8OnT=ANβ+38% Z6!O#D IHP 1;c ,:)٥?x{:񞧎~aog';Y+*"K6?{i9#פI߿I_4?wbN_;}3'V8>x?|_NO-Ԅw|y~rt{~=$!ࢳQ?3V.+K^ W>kQ{ܷ''_=UV7}ǽ7(n g{i/C@LoF]ڃλv$9I޴&eRt Шfkem|6ˀؚyeHnte߂)%YU.w`fHDĐ2_a*G_ˀg ~4eZ;l|B,ZiWgZ6bU4I ̝}qdY{8qWoLWWWۈZ}bId\NUw-mK5[A ,,TfAIxnyTٝfc}kc)-wbefGk&7l~}̈́)Yx36Gs*!Gl{N?@V%,2̴"# 2F[c oK1c]J|j{rc/ȝ@U9\کk>bsԽ֟ D\l"L&5k9$&UT& s" ;}Al..k}GƁ#auu &0J: i#DζҶ8 wuBvޤ&ni>@ܽ?Xj|ie5Zabh4*N V#$QL9FٰuGG>!dX9gVE^А0$ixρ,صEHEBScpe6#&\C#0###$#oqHg#,#-ÄF1!DZ?2@ TNbIXg0KSsD E|jG?Q$"qKR*ƊXin#-Q b ;icǟpvSF =I$K=鸩fIKp2ޤI#&M lu==h8op!r=@8}q}ooK fB@{MJ㠍1%"J~Fזآ<>+C=)p,#E3~vhFޅ?{ƍ_a]4ecm*us*'o "QDNRט!!90>,VTC 1FѭpUfԊ%Hu&it2;:{NuR'(Knj99څ$}( p( Y[ؙD9+P9DpV vT4٨;ƞ_89jUCUƚ-91YsPS5.1Ruz!'FjhUIl m2:DPÒTT%e(Z[$M+1gj`]@ MNduS 3`f~ z<2X2A]{"Q[:k"fzT㉑_'BT:(ky"-5 j b"6Aa(a j`Y|x}%&uA"4V|ɁMՅp0.qxxz)pNک x@n\FX.ftM~܁D'`=|{;xnKk2}YZ+"ݶݖm;~"\RY]J8߼~shu )XbZz t^*9wI;QQa+~^AyE^}F һ(a!ݲPD31Ah{ɿ{:5V=ğwgz珷x2T^j2v:]}wՓW/.v|Q73:B-iCho\;b3OQ+RT9 `6ɵK3 =hhZ +EUԤeT Hث9o&KYܶw3}S / wں a߅8-qXSn ?}Np[Dv\x=Fou⽭fT(I1X6ܼ\/WF:_Z@ޥIݭVU4[$ڢ]}qʣ6h:gWc߾h ۖt}Gy봸 GhrF M)[>Y!,)sen\X eL-ncH]Yjp gN<%[ !H8ah#g%J3ֽ~8䅳<%I߶gpJP noޥo?o-'.\qRuoF&v*$>ގnn?NqiYu欜Qqu^_* _߽\,蘥pݴާȩ)zCR {% nb0TڽӕtL+$BH1) C[ m= 2!l.L+wiۙ\U)*44PDJʺK,!w,^90]o\8!|]񣯿GL+i $$ܝHRu!IBIjY}Z.D1yNzDyfj_2\@e Snp ZrώWO*KT%R{u4a~j6i)'W`<{u8uVm}f05alpzG`:&[b؜=8+Q,ƟAOmL5?^:lV3pkss6vQBPNS_1DO ER㲋;*KJv%gZȁT6L]IT{Guá bC}e}ecA~'f|j<'8+Q1Hw *ޱJgwR9o}Rem* 媱6__<|,1ˑꇕ|~|ٕE 8z꒤g3wިՄ;zSő`g6_/YAdqe;,.1X?OtW2 |.>p:N`]|p_WxUR~KI{X2HuLYu\iۃ/طsOl!0@aPSaY廘",ucT3hIK4V P=nXaI,)%fK/tPjfł.Y} 3L\X8[1gBs]&6 JC)iC̎&2CR`ﱠ/Ԙ9o@mD0YP%&?Q Y#E6bBD M=cGLTl!7&goM,lxǗ@ᒉXQ.D]Log'HWxihIj= T0|/)FJCHFuFZx>}A41} cٌkg' A[ MSspH j嘚hK&/T'R^ 鉢^ECTPWAB*y6 >T5d0XSAi#4C;l crJLTThƠ'Ğ?2$21,&5|þs̹fX)G?oz= l!osψ m1 aHꏕ.߼3!VBXt"Q;eRH!j<rƠ0x)@Ҟ x U)_PhVR*aGF\$+7 >& R \)',C`"hm<' WDu!/$"0a1a6pi֔_雳D9D| 1Z vR>4XS1IB^0Ӆ֒SBoZsK5֒4+հӣU/(wU2jܗK|̗~k{훳G˻ٜWߜoq99~8 &ؐB )$p)%gOvV|x5Z!fT 81<{:{ ; g@ hSvieV$ϔy6sPJʞ ؕҒI0`֑i qpB1 pqKKhIb̀D߆wyijF\Ek1ΎM3XShPoxdzezwm[v4Ԏb3'mE#Sz̉6[Fd ʠ1^s庐3#UZ#ʅ t.FEh;ezs4fLCPҐ 2LDYLu%)H,3}HAXyY$V9TCT΄;)[ RXy}sz7 51e2&.ja2: /.D*B;3AVQ?{OƑ 2; *f$y±c|CR"lhTW'^.tSI6{ !) xĽ;fXwۢC FiשX2L̯>;}ᨱamgK[^ y$"N4Rin6Оq1p$Ra!KG^ZԠ_ &*meJ$U25Q!X0R0)~ȓse%j/+Va"> ڸQl"+бE؜z,bv<>;= F 3ByQ^ 4Z+. ZÍB`.a}_zM {1!hݟܚxf814VV1[ԔٚvSiB!̈́rÓ e7ȧ 8"zzDP *nJEcr2Wn+n]N\ep%Aܤ; GF'T832I(*m Aaa@j4WLmJe{)ѻmQ63g@#A}>kA HʊR@[7Ls)k TC56mCElh׬%ImQ5w>ԎlwY1?? 
5 3H494ã}{WcL |n&Jͧv1|{u] OWg`6?\(hdD';$agŇEUٿ=uMɷ!UOȑkjٰ[ m+D.FO-Ia xREIyw[C(.(,=t#(Ta[jRӔ!\^nۆmkE ^OKeY统rfYF~еvH27`Hͧ߉{ug{{:W&,dO^H`ha,`C_sh@( 1́Lՠ%6Xvyo6L*] CV~+w3Wd3 gho$&KЄ@VVRpq.+YoPvyo % ӊG7}J&R}<+ozn;|24ܹ 3 Eum &2,w: #/"ҕzGS;q iW?#iF?▤T_B.K첈7,co~agdP2gJOt2vGTt1 ;?w,sX"Eb*,pa,PnaKxF*XICH%:YDbLE9g vF<,&/XEg˨ƀl~B6dwfzu7~y>w|8"@'YjMaZ !=sMg= 9IrRJZi6Ec"Kr1QP8{Hniᅷ| RP ~wZiM(badcZf @F.*?;4KcκJl,50!5- yJN't5kr k9 F٦]hH*i-<;q Xr{|"!ܡ1XVBGf;Cs:8L[Fpe#F)q;8WO,k7ղv3]W?VyuE:_hgW.۷RnrT[ ;i\T\@'RQ4kW^u=p ӻ)B@5FH/KwHHQPuͯ;P շ]U<*Qj4B'u@YёST!$#B")),sPx\^kİA7Da _1R|5뉹FY͉+ f:Fy?,  xǙxάc66tM VLƽC5.ȁuɂ;NY$ܠtJgPEEt)@%cV"H& Y()wDnIhn^ !UEJ)KĖQrt 7')5-rDi׳odGsygw=x^#+8l=ۅ"IM߱y,-ôᤅâ]tR- Riߩ2Rcx1*\)[␯EG7q ҭ)rX;H^HJzҭyL|,ڈ$;zySJ'J8a#xڛtk htkCr])S2E2A9k\Žƅ5ׄJwIݍ=kcP`Wݺ/SeY4@V?`M(*(r$rf *I8WH+'(BmWaZ(_65:eVI3=/nxTU/FmTa#gfvZI|hِ.|<3`pnbme]֠!̇γ.O ӫ|k[IuC"GFIǽhĔTJګX 9aM=;ҵΛޱRjjMjqQVŖB+5_ ң{_@Coiwfu 2I&O]4+)sW0n󓿅lvsvzJM/=ExR ffpg(S49Jrxŵr]~_wx(?Mw;lۊHrR + bȄMᝓ1tHv~dw#eN ޴_K3sΘ&ُb'}/'Nכ4p=M,3Ĉ>aawx8φidԮޚi3Yf\ T YfYUMHJ,wyM"+i J}2 ,๙O&!W"B_Cǽ$I$`NE_lBjD}k𣃟[PUJnf׭m1 8\Q V> =/ `(\\$CAMWw={=xR!HO3C^15JCǽ_597,@g29cUp.B 09\0QhB1m$ /9aӅAZZ$ۜd]NvjTH#;bG'{li}Bx%zhCn,!Q3Kb1nooJ뜓q4f&Y$gO(sಈ{1f6Rna2%=݇rU*R W >x^RM Ty͛|GN)cuM{0LhӼRw葍wDHDHDHD:'61@4{ 4S4F} І ellF#|oT#poHle*'ڕ$u cy)2 A P x WE@ gR%eZɖB觵q-#7!ej0AH` )hydbe. M!jNg# qOo?!P T4EY)#}HJU+]Q`- -n[j%ww8G) w'2űXNAQ>a|p 7'@@ Vc%goNP̠uy9pwhŒ(NQ^LtRr^(5__O2 )%zI+\+mD`FD{$RJEݺ('$ \Wad`<^&ΜMSm_02ϥncv2+ OҊ\~qt?ïMIv4@MC,h/@MLj[(>X1)c*d" 󫴡t{([r&S-r/Se>SJ9MHevʻ)Q޵P@)}ʻ1*OXGRq}QL?KTgNj-Iކ@4TZRvsr3SaJy P@@b z'*7<뀥 =qq9Ye^LY$@.%x™_A'M[Q63ڼj=heār:ç|BJ dW=Q&FLN +lƀ.@+XƬ* o3RxD0PRuq ADObsԏ[3! 1f _+(~%?nG\Qo.FLSiJe-#I|zjr [TmaO"0sh[~edzn !3*(rKAITPD+krSyTPP! (qE.U͹3U0KϠYZ䈦Z;BB#UD:F ?]}n' 6>>u5@=6Cm^foë4<(pz< ǯ1hr#3p :6M8JvC}|wlo+,ڳT%ҟ^DN_sl_4"jcY0sk?:ᚍ0 r_*woy;h?D=kťx6*.ڂnߜ\/nΟw{.20t$ȯ_W_[ּdػ= n:qjݒ=hP1dOђ]BQ.߹ /p?UM5#nGpIF)yTJun٣"pr ٟr(x#J!DQ{(Āx? -oe!E tUtGԱ.ga2*N/_A&z@ԮC7 ZO@,֜Gjk,{ xs_ ^qIUSF@䬮Ȟ1ݔ1 N3&˅1Wѷß6pF?Vϊ?v~3'!Ui\ͼ&q6 m'$$7t FSFݲm::pé$RRpKFK%gsbchM6;{;{wwv7Ue?6la9g FVتm70SFn;PTSwv씁2lw][{/ЧZhx!+)ʖ*,m4J6,ʒ5@*v;U[*)|m5Sm}sm)JayΛM^ a%0l6,C[ṰT2vvGi5ڮ'.I+-+KlXLBչ- y[}?Wl?fxo6˫岼?̜y<:;O#>0QMrF{b{i|Uv^՗r4?!<]2.8\IW;0pʌF>;$daMcֱ|o|Eވ6u2[m91ܺEcE?Oz25o AGj WSe]b]-VDFTF65x+Pu 8V]*א3-I6cN5gk D d /ZS@taU6T"ѝwfmC0p4NUUa7ËVBGʍR: WXp@N$3w(18:%ˇRr JCxKXc}EHLyNa$Wfosb:eь.{1^rHg1`McɃ*DeyN>F9U`?4`_[g卌B"`h\:TN{<¨7r#xU9;f@I͙ Lub$]:c" ^NgIƼG# h+k'uVKR5Eg P7cGDzs~Jᜥq ЍsUu;rP.udq8!b4"*,ôgF{ywvUaƍ׀Mn1(NG][h\gnG 2p!dbtҚ{hA\pPkǛz+j`ڒȊQin0dz}&qj K) 's*Rk)Ba6(H; L/tq{NswwV*" < L2!>ۭ`buYETk1~:h.1dSY 4&[v@/ji kTOfK)e'fr2W+L!5y(oCtBU)r'G#[z +uT% c AJmȞ16;Ly-;lR=hk/c>hxYcyh;,̠#\>,`[!?:fϦt^:/ocA\9gKۨ{i?|`톛_, A$U#UT#$LFv#PI ʬi* M#5ʕ BA$*]6#C,( r(u{DL-v#݇Ǜ2<͐wۡ D&Mu\h[_.''PNܖ&+Sd&ضLJ>;A)t`a8NPB” 3Ɔ>c Gͫ9%%+w\ӫj Eb`TH dܩ J((>]=&t_=-zdAsfҚPwPTRiA[k/7}1c8g'\./ VQ]}mw Wj@5 {ܡgG(jIΦO/!mOgn\uMԺ䬾[ىΓ_6k.Ɲ7b~3k;_7yC~Ֆ|pqVt~_IScպںwֹX/fwi+^p HqfOL?'Nf *O5(<onHJg==X\/~9yZMU_haDi3&ܽ;YD?w+s0*>b$Eܐ/\l+:}mLJ@NGd5w=+< = it4Bn;Dʲi]RVzg1o;Z 2){M1Ҵ1Rpݒڳ+@@"5~mΛZo]qqRF7M-?B!s.a\g-NY' (F'5u"KEEӔוd{]q1# g8_7Hy`mcU̟`+lY8"mCPԊ׌SɊmn=q U@:3ZrM.:9-S\=C\58Ge nOe@4ݣQNyJdi4rDWzVW7&Z(AZng1~W6!hNiC^h#9\Z&CSߕȗ!u,zrX OP U38?o){[{#Y`d{S}=ȁmƂLR7\i1-/7Ғ*3sL'yLh.v#Uc#"k4q_k==<ѥ%%AP ӥvz{\ ]Ec w:"YN9{Ƙ.2WTx'l"NHR n@MɌ Pz5mfǧfĖ\7Nr KPJmu[}Mͼ@u8o5jyN\]5<(YN%¶E[J8̬ozR+K_jE2Gv3oJS*ڪFJxئJq]Uݕ,u"vI#үצ18'2hf}p$]:B/Wn J6`+x18R|^-adR2ШR`L^^- vzp|h0Lc5mQY#0CӞ?ɍĩ5ͼ`}[ʪ;CVu, RSY@^=YsIB|`K^u^JԹxAQ-X]ukL/hT({Z0Bt+UKĊ5vXv}ҵu貿 *5jL[0܄% KNO8 %4 N:nZrJ>~Pez6fbo)7 KAE"5 0CAC{v:ZϽ$LUL&^D0l&4RM}d \d26V,<! 
ILfģ|NPl%|KR*_:J(dM<وՏ=Ďs&wǙ?Vƫ%(fy1d@G}3pwtA!#sLfvU_nS˃r: V'_ʙE4=\t~ .E.wK9 ܨ7̜׈yɸ3RN!ԱHh3 !*Z#k9RpE5^ 49e:gg&e˝uZN隂^𭐏N,%i8ڛz3|MiCS~2QŞJJp\:pAX0j[!)`\kM52>j#0 u_x-2ltmp5f˃B(z=+l8kpRr.yɉZcb+Z&@z s!H84x z;4Q*QqzaxpToɺ:N6q44rjM uU`gW!r)VmmTA6uëK aZ_c}nJvSݔR&~^wJ v d.WaD4Ja rf`}xaEΊR RX Ia(ᜓIk8d9:%4`',eȨ*W[pJ NFbI 4x'QQh|`),/E@෺1^\*ҙ_uf@AG1fK/EQdՁG%HR+2r( I@@5i2n}rӐI]Y _ #=5`Djˀ+ 0hV2 B.R5"ru3X|ƤAdԓ XSJ1h눧60ARsJ-A5XF3e@RLh箸aиa 0Y3E"uaPؗ)GPbRCZ;2Ւ8-܆J-Q,#7 r^e 43 ,* *F\S,iO` }KLP6Y؜ߤ=Nvb,|DN;V0bi`nd͝I`|<f. -1]iDZHi[kҍo9y+@A͝sw{,1L}&GuaZFQ^ (b7G/w+ؾ \7biwP>.%&TSmG] 2.HW EBSJkWL+ނ"%D NM |f_âXhV[vpEUҺJ+;x pynj\MkqmظTZhi Rp: [%Qm`#eti;P-;jP!Dh#Kx2Re1=:bZPQ$ }%F2u.wM$뤌Kj(e6&[ 2&zLʝS(5VL$RLT(fۋ V0G `/oS`Wdk Iw $JP"1e/cjaWPL& [KEb 6Ō13C1Z|M]"EAo)G( &p9 ap-jLJKU6sbY=g)ce[7Z[~,]D#N{GPH]+ J|Y5#ڈ`:W krWEYDoToMz)3.﮴J/e:a.T# Z~A,R;+ nFo%3g9̙,ed̙RȭkRi6,UEiS {Ta]>bUL< 3cJ;:J=aFkMyVUICo X4B)o70n ?/N;7{G7̓rz)h1>5.̖d/yN},t>p:9Nvo;mO&/0%J'mwZg;m-I37ot ̘ʆ1C}ek%zn/oٯi#h܆HxoNMwe11ާ-'wI|v׾y>0sʴ4uhn>gd ́ȝvsNlO3(@5@-&0B> b "yT'ƙ]"Νc7eCg!(\ 81KAac3^yH ň6e]Zbsw.]jjVkO}͒6h=fMTS(|D:.MpoqqiҋsI.s̀a(*G 0QDl2"62^!TdWK=rR/mka]_#C.CJ9zcr,+h /ne5.R3$) c,#-[> Vž@)yK֠GFoT{O=N|OZRƷ6J!\t"*k@tZ收qGOʅ*RLYifZpyV|af洗)`A{g2=39Z|n_,pW'6j6eNNƽ/bs 4_?gos2/Lý7aee[r6c&?r꜡V.?Ǔց;ˇg]GϏ~9>:kGGC_,Uh~8Faxo_rIy]F_qru_){=ǿ*w?SNFbx=*vX=rwzdt"tvn1~@F)>8l~HOë7|V=p\r#89Eeoɋ_cU)UBS:vuOG5ވW3i!oZxXu7T6F /zs+߇ɇQ!J[)և/o3oAaTuTQp(R-U{Srת4\I;V LB_X+'M-[yo@T$Zae{"[,e{S3Y޵25i;1VK+FwnKf3X6k*m&DS7/7Fl[@cwlwf)uǫo3SY:/oytW FVqKݗV;]̰KŽ=+KQ (#jp|zx>؟(ű7o2&t2FGƻ2^R@tQethG@g شz%Ŧb]*&CDbZLb{sH\ta&)M5)2f륏PHEU3ulVlECju# jl˵NAyׄW`uIVX8;v)tZМ*A JBXu#giѩT hݛÓJ@*U8p(}޴ؠsWSNkPn<m*V@jVd?$-{v" v$,\Z-_%Mp=(=zI²C#y: H 1J4%NG9KT 0Q+ J L2ǽ'}ݍE}_֮B9Exsd*H/|m?F`v* E»uH@Nеˈ,FjTa,=L_ɮ}4E6CΫF8Ǘ{Cbg&2zktIyM ǿ3Pǧ<}ڿ -D >w:˖|ֻ~?7?7:d'}/GSN3%^ tJQ|?&MmG2@NobsN7|pMO@жmԈEnN|"'6*lVDVQf$Lx4JUEVJB$<a6jg)e;lgv4_ZSvjN-۩e;+۹N,hGؼp߽f&hUl`h0YxaiBHu}u;SFԲC >KqΑRVYCBm2U}NJxeZ'*8cD.@'y 6n-`rŰN՚s6Ohb:C`,f+uJ Ȋ1P /\@))0i!J_^x*W%?|x+/'-J4g!c9_vi]J5 zmWs9JXO ٪h1c=]2(~0_bϮ@y4*ė4Ep0m)_%a݃ڮ5=5B=IKw^dLKAl2ۘ,BV\Fdj#E卦A]և 9"X>2XVpjڢ-v)mದ-jڢ-jڢ-oP: ͆ӫμqutbd9 3c ۍ$Ur Y;TKsVS٧OeűwҽyN&C,uf;ɧ8M)5( .pP%MY$䣁?L--L-?8ځr]6M'ܮ,՚*EWUycMbpAc&}ɵ ;D!mɚA#Uδ&kqŒ2Pg׀ٲ}a-?HzK hɊLPx"8XE2&I҃Hp[J:a+:>%n.%2m+V]Eۊm+V]FCsG xkU1c6U2$vVF7F e}<mz -?Цj$A/ W|$Ne#Q]js 4WXu;<6zZޞpZx5|_~2쿻|_}ǾA:W%jdfj^=i``q]`H>``?[,Eٳh?"TO(|_Qz1eO1@-Z^\>`i"G9L1gbd/ l6'ceZRq,&O<] yC* lw lI l+V]NZȆWG:6 8EcC*&!T>Avv6oYU*Rl7pڢ^*oS&\=o $z0HqXŷ "V(mpN.I;OhT$K$Zݯ$TIP%JB# e 36:bx&MȢ#PqLl1AjeSG?+Hsz33"/ ؚ1J=3!OْeylVSulݝ]2⽨ȬIS^ I@KǂL|0<.ӫFj| Rׯ@8ߖ:XveTm$FIKv29fcJ!YU/]Fު}iU?ݼj̆(!~]..x/^݂h>.ni;.e;b9%bXAdr/y?Sy`ĿCf{s @`;\򸬚[>ZgOyzy{m>:xZoVT7_w`y~m,L*waai7߿*bI.7?v)fqBr`ZW󣳎T#œBg_ oEL Ȧ,]Jp1Ypſ5A%_T+^_hzs`Pƫ7Hr崕;rK+( g ]z7{I6hrwx}#!y*?lOY'ίOsx8Ʀ65pߧ)?~) ptzkmpʷɲAT9r?|v:j&gcM1rEe>9h] D|!W3$SV&(UW ^Y7! %cvS5;^`k t*ハvyvyy(å||%`۽rI5;6QcHxͅ|f)""BJBK\TQR!_X[8+Q3J:iUo=j&zMxlqt5IVmX3ٺBˎ'x([-d'U1f#O.T)K/ b{Z0]=$ex 7 X1]ZtVk9+ 0bh9s CXq$OBGie\- b嘵Ĭ$1{Xdrǔ0Xu+ħL+$E-:$ Yxy vA΋] !.e,cQ;: pA%5u9^-!{2hA˵ ^`rg idF(|UC!(ڀ;o]Qj6iPƚi.|:u!".NT̮!Db%UKDjYcAMZvQ; u4 >w; c=wǛT+`΅\*>j2JU]ՔjvHyLI^V:B6RZB!By~uNq v>Vxwf @rJ7ϡtxCǂ \΃֣ L(6L{X$+ zod o MùYdHXOOH ;rGgGaWi̦ I҉YS ;D^ektiThM$Q pG-]kp.(o F,dJq``S=(iNJd݅nT0yedv#Gg@IV׎ONN6dt 2tǐ\K:Dlf%FXIcM LU"ZiOQN`9Y=֩XD%u5Ͽl$'y{Uy'T<͠o0DT P{.oq"k)wBL >I)V+قtzt{D|!I֒TJVyR6=,V$ڔ&G6wzr3ItHʪ=l )0`P(׶"XCRR`Jὀh obf@2F3]=$XD|\/ê_quW-!5F?հugջwl71_[륂aן]~ O8g}Yh/MWF'dwvhu-ͧzt~B˻}K 5O/NUpKp.qϊuX9t8ޜbfXxmxj)-y,'.[{ ʽj(thNhN*+䖭lI1.Rh7P%X:at;ԏ%.Axك HL1.X77 DC@~P9L^joe3M)åW^KK60Ĩ,N;q`LK F6\ƽ 5kt`z0\Ay[=h޼M*>s3gc h큃'hgH"`lbUb* `s;D}d=h((u›~H/ƤeCڤ BlT,FG $g)MI8AV샰2%+u? 
F&e0oKjc`r\No2FxX^M4Iaiۣ3TPކJFeHsTZ/=]3r *s}uH ITQ5 `̀;57IfR{vh[ؽj&~])i&UBBIV# k>Tk&DhL^{oP L2)bϤ>XJA!Bv\4iTK^cT_C_:npm\B*g@m"y| Bъƻ}#^$K>OuFG Vɸ-4CXf$o#3>g8ϟx ty׺(;^h`2$>2NqOp\_5_),RFR, uDJPj( Jf__lZF‚}nBaZZ71w]zyJ1Q;ئڵS,;ʪ aP$ϺVQ)Q':@7"Rh';GxWO7IgU2Q߶ :-©vHQZK_F32NVcyw~9&|x}r MA,%3RT+TɔEsuRۂJ["o)NV EL$cZƹhMq ƏYAOZƓ𬍕3rYXV"J 6 ;|k (g΂ r=l&{]f',ۣ98vko#25Tug<9&y?(ܴb: +C1,lQEd &-~,(T37@,\)ޞn]$3&um-wυH*+Bڶ3@x5By`"A< Mze2A#R⠀T۩  8HJO{8KlgXi7'CʶIƔ}+BΈD)tt) WMQԢ;G$kD>VHQ5|-$7 9!+EdfYf%(}$lIwSĪǖ:kHk;5N{TBXnBO,ґ#c7p8Zi֪9&UVx)}"3~TAf b{'NTiY .am߭=^yy .$ыK57eÂPQvoߠJ&SKTVmoZc v{DksmzFIL/?f"U81dn'e/65Yoؼro/6~=F Ihn%b"[H*p+P*i[_u_VcPImR61lthY״hhT^{oжktX-TNӹ}$ek&l}m'kNa{;c:uo$#Lj~_UÐyj}U#yݾAb*T= QIH"Էj6=1RH!<䄽_ m7C<>߸oP%M,΍)OwM@cj4c }3dMQxZ(z][oɱ+^N~1J : xy:tO͵$$% ?դ,x{8Ҧf5DI6(Ä09 u7KKZ|dƇjؐy$F_ԟV"S1dD}^$XeF5-5;1i{:A4;J6dZ7$-),$VH7ZHFy2 0_CB㒡yb驪20Drb2py[#%ovsrc2|2eMvy`6G$R,<(,)WPms !\t\\";MD&F6F,cˢXA*(w-#qŶXUgn1D*{2FExB!VDm(?N f@12,@(9 nz(aT -U4qs/8#-n5ž1w%&+LlȊ2,='%e4PGL@6yɑYpwC5pN2=LKHg#ڠFQEiJHJk٧B0G]"siB:_qBG]B ڴ *ܧv;X2n s9ÛeFESkTa0 kcrYx _,ؚdHE.g#w7>bO%GO0k'՗kT/{GVp1ΐhmqx JgC 9mkUn)0 oՁJy*yPFPi%>y6ru8'T<% L2wRy 2x8 @s`(cֿ\tNStqD+ -beJe`! Ԧ IЄ`_KBYm=W0`ߗ2}nj k\*ΒDbeo;Y!;uhe/Wkh{2ȑj%ΦOa|":-՗TU~ vm  A8T܂/.`_:-(vn~&B@ 1H2Sy?|t {3̌~X^c|͆7n m[e0ue+XQlkgA z?T̀ɝw*r?.tt2ȧV_L>;?$xth*djeCd>\"eh,쭳S7G>dz=P\[/zGQ )Ld`Gf?|ZD" 6oVI@95ٽBF[*kh}҇RA݂S28-@@wͿ(+z> ~_ݸN<0~W/-?,08YY|?}sok[7w3ߕI-cX'~at8a0_R u2?/_xB^Z<.%(u1K%& EJrppNaKcV4(7Sm6ɋ0ݸ.,uN6v~z\x81J"d<>ۻ_K+ D@fA`Q?77|6^۷s]ᦴd+Hһ2:t:>*!Rp{.SK÷nn2݁w)~t_p{ͤus'Ɇ7a\|bI_ 1.g䈓jt\!SqUyY6~ǎG~C,M{NWh~zdĸ Iw;vؕcWގ]t4s'-Ν&2#dr!yPn|UJ(+ J|_aO ޿BjGoRyNT}ѫEӀ!1LTnm3_s0 /]tՌ/[+hV"-~8p+\dψĖJ:ʭ!Sr6 g4F1u*LZ\EW#c>Fi}6/!~YH|U [$ǟl{6`y~TШAE[c dJI~wqe~@ 2%5P('rod^F&nӏ]}״8G\5͜IQⴌ&` uΈI)/3VUB;Mg<*,֍dbShb5 Uf቎ lyGa;hK%\*ZX/N #F")mЁ5ÔNߤ:Pr H <'~†KB UK(׼ɮ s *鸁Z'\FnQ hߟtXѵGj0H|8alnGZ+LU}~B2J7Jrzլ)' ︁ 34#8(<ɕhFAk+ʵͪ^񧑫@)ma:8AغVH9/~֚CK-HlOz юf'L5_['xz1U-xIFƤ#718FM4 iuXQNG^q4#>i)(:ƒ*1 gO!=%w3KѴx||%iOdtw N3 ӓ{Xpf]bAMYS;$۟BzM]lxro1AE%N0*+&F(apT&LR<O338VdžKE|}rE ԔL.T~cNVN;;bb+=|CAĖwt(&t;:PNSB$cu_׷@byrFW2JY"Q*~w}5xB ~dvyoϗNR3Ϳ n͇FGU.p8qscrgzS$~<1lɀό ~>_tYy^~UEpQ)xjr" '597J:B2-EzAgn>}E\6Pi)IX3k|'X~`L@u`DR15/oW$#Syx<)jO$9f$m!cj9#h<`=A,&?Vh_xs`EF1#p"9zpzqLy^i$8)usגG5|0sZEV8jL98|c`VI6\Fe{s[RG\y=S%2e(&P #\9U @c&:ƅ3q.LaIqǟ(8c+<٣S( ­>抛L(Z91"UstDbEcstGU%ij V"` \H Q^׹nap$#f2-9GW⭙=;(@tdiԘ%"4lז%EsMYhVh?^`ϤBdwmoȫMq'\Sa%7Qp"iSgS#$%GIR#eIMW9N~$NVIT{KtPoڟ`6Z(ႪDuii |t8w~3̌.m5ǻl8x0tp lg H񶨮^eqT)qۨP?|LdO7>gD wJgw.SvA\]ϓ;r%r\*$nBݥ cڂBHKYz1HY) ei'kd= KWv>*+ Q JfNE 'aIJD2k`%a"Yp4+AhkCp*u<Ѡa>C nvjGD"ݽoN8 s\:$kDf$ʝ̥ Vj\QmO [( N]Y^D{6iqOt?{{kft9 Z'l?gGZ+L@Df)W`P*gI鹂oKB$$[\ZJ={on" @ ±b %1iõhrP+'&#YXFf j*"qխ8Ð\F)5BXtd |l8Ǎ gdmdۦl4Y>YZnqx\nAն諞n?NG ]ЋWI0^~ 4Ho&_1:#Έ3ѫt4U Ad;<3 J,ot6]5wyiǨDFA̟f# (8O4q#-l/o}cү`E]O^4lx(>{ѓpwST+&w۳=.l]V%ǨXnrUCư.rŀr纝oo;Ɍ̎+MJʂ}3A9` 6ğ.%D@|Mx !̆їk=|/6ꇵ?)aY0`Բ0P;/s iCv)+dUA+e,8jmm VӰf p2̿ḟhCb=aa|ipjӇk@>Tnkj1(RoGSLx$28X{,R7٠+Ƹ;Pس"^C"cC ,f{Pi.aR?:+Wp'dU,k4U ތ7KoOҴPFKh{KrQ%Q!KbGPQh${)e!DŽnXTřA%qpMxq͐8sQkq!Xt5\+c2祀v.5-٬/S_ֵjc dS2YL*YlZ2IeTk$"oD:Zyϱ+X`0 [ai c8$T^GU1ð YQ%GcX@gW$6X q-n2 LbzͻmaHO6?*,ῩK۞uk]+oƝ jpP :HI" c*1%pѥm]'TuDm :enˇ!ĸczKb3NRy]u夃LuY`ʷVҢ*NZVc+\[Bd 9djBag3[ڛKLE`Nk,W`E Vz47@S!E(t0^ nP{-"̺]0 D^Er;iea DՖx FɧT >>*<ӳ[n8b&rs0Ni#ro/"8?1/`f:|AWG%Mh(5Yz"I#4G@݃#!Y/)-@l13ng5{4s͝gG|x/J9XT LzeVdr/4qc 7~QKY?=W E% W}N SӾŒ[,4:fv1\̿ aT-^ VCt\ہ~z7$ld!"_@d\`ǿit0% C!UL~瞖G>w_-_4uw"5wڒjڃ[0j -mEI9(s(H#yWw¹n^8gq_UZQ|LJi!&vLЅ̈Bߍk}v❟*>fae'/"B[ΧOQΥbgP@Yeds9|3133zM#~߂6^OK~M0pɄ,q\TQ >{o@_Pͬ(1poҭv?Ӟ6`xH|rB,fyCFg0}k4GvaqDf=䊪~%JwC35Y9,i⫠4fuC+杄PN[ЍWwq.1S ޻3A |dXi !o `i߱zx$i @x 6?4߆Q3̻,fAM﫭@qc9竄/q, 
yO+:BemA=6s[{EԹx綠=v7ruYWs>CQa3ЈO}5GV f,4 HS23c=uWf.}28T@40hǬXd-&8 2K~Qk.HNI&:>aDŰ'v1t!u oJí97}fD^WCQafUD/sۄcciP/I[O]V,Hʤ}Z4 T0`D$2s5nz E8C0ف\r;*2QUBp]4*[z yA/s彖n.8yN+5 {}UI{=3Yqm 2qE[7R ~^޸/D%ŷ?o_?>_?Yr=)r5cpG}uu#$>`ec5?S-a҅~U~}dhxkV\pPsM(kZ ?L,C_P>4p@.:é]\Mܠ+N+&D,.3h8+3<;^A-"$\%8yu*)`SB?H(M}ďԫl8 C(CJ4!IG+ƦP×zjN=t&l!@KeDfgiX ې5ΙYa%$4W_MQgv:~w,;6~WOzF);ܩx2n w(q`-bs!zd9WDiQdb0 l~$n<!͡?Ƭgגg')Lڽ!N`r ע󷱦٪"aY LG]K4ƗcS A?[0{ |Ջgymz@/aVPy߶ɚ5^a+3Jsc\UU)72-t"/+!#,c I!ٻmmW47rM>w&m2YzMɀXLj(9q=dZ(k0M q,gpi U՘Wjp^cJlι\Untּ>*Q\V_]t){(T]H8}GH`/{Yw/i8Yp3b>KUZyC SnDT{@]uB66q[}t[ i(Ÿ56y c!B$#aÜG.{T/7PId^nGo`ӲG;RXbcDyؤf$hB@ڄq vV)c8@z6JPfOwiwJy!})!A ʞ8?A|64yݪ?S^` I?5+#f^$'}v%sʾ~ w*XHcx(66A/s G˂mۿb[SJ~s^o?d+y7R MN+b+՚nA+y2oZl&ٞ +@R'j<_.YiQ̈ /`3ӳ>2=4ȪK ^O(v|Blvhi`7z&$CQÁLZ Щ!5=1 zLԅ]0ڽ-Zy!~4$ვ|~tY+-;h폺}? 퓉W lOlU twfOK$S0{?oϺ|-o~ߝfM몞^mbķ'Cݏ?zor&W-$ 5;Z>}QL5pyw 73׾Wq{Zn{t`|:?*[kw~":fϻ]gH2HsEOłucȴ"s? R 5db7Bi:G$o#i%&aD#)(łZ*AVg Z0Ƈq)/!KB&9q|^h8rB%W?JQQQQDC(}(}+}1b%$u:a!UR'nT'UuW '5{< ۊOOwR?Vmă6 ˙?@0}EЈј?Gch1] ɯk^_f:c]R !3.Tx}L{yK f2Hp-eBYH*^ x<˜,ϥWː7ń𡬾oKJf J'eswk,)󧯞5]OnW(7G|Zpa,R3;_lVTIy^O+3cͺP޳BTuI~jTtC5,|ԨJJ۳?x]άGiE*mZ&]:]W*/a㧪M1# N*Oȉ\%p  ,"t1&4(|mp '+h FΣ+j+@ڠ~P7~?pnQ? ~`;:ovȥD2& kV$ ICx@oa\h~[ ?YJXAk)CjLB0AXsk|nwBx}m*:<2 ҅d%SYj!+ =͜qi,<'D&]wE<'3J"Sq}Ey Gh([kn-!y`MM];UVLPXũMa'R/NYHኑ/ޯ5ʊc'᭹3{BcW!L/ T6t#L)L3ri(bY4QV2ˋ`4I&@XIYeqAgbV9OXB _$9W/ DɨO%SVPs* ф#״-$s1L9r~N^9hiq(|b}=AoL a>[qoENޟ~Q崢MEC z7ЇA0=uGp 3;LPwi% Ng ]MfofX Z V`-Zvd5Zrxm^9Z ulq;A^8HNmnD:4+Cֵcwu ý { cШ -Y2zk:&+^y7_޽֫",LCwv[㿴]a^k,$B(2&Md48?8cS}ngGqN-L=嚘F]_z,8:CecA[ѲƇ/f%#Bpje"b5APԆPHtz2KI0 f /[JV4GQ ~6Ap$*2O%EZQ?; &tz yjJWi ʐh J0 bҟ;_ςp&5"#` d8Fz嵓e\j$(`1uR$Zizp<"n?F,0cLyM#G\`JҠ 4eJWP5Qqyx@,Zs[jIus ֣>kV*Ĝ=v9`)10-Xkʅ 0&B"V@&+'Wʘ SV&K>{?B˝8bPü,<)r2TrqH|ț//9 k 319>Jx˳/2߶wuQY!"A8|2kXmvwoAWnф;_ _G?{v2[yz_?5sBǏKT;\F-];ݷ/Nh|q`.XW'Ͼ_g^yl8|Y\>lgWr[g~WqEg__L *8Yc}7_LixN^~/Ƿ=N܍ ճ78pN0Ο⋩ܯP*VO!;^ٺ0V*v=t.~7ҨjaGhuLT9Z tsB&͹PHaȽV*,o~PDz+,}9$򓰴Ǯ\c >畑Jc+s6;Z@^n>_ÈlkH"+y; `Kڷٻ >6=}6<vw}c#. W^~xl(ӆ@]FARGMg,]i;҆تeQ7!7;WVCI2|xCJ_P κ Cv`ya6#)IlA4| .F~xq2 g37&'|~]??i,O\[gZG;=KfN@?Ǒ"q=14"}Aywk$֮@+POHp!Lo˟RPyBokL ?蓸ꗫTKBAY-m/[ $)y` /US'2vsV!ꈳCyńo~%oiƒ@Y9&άOZH`N8]B[$Evp.%-DpRSݔ󑩤ڤ{&@q:9,$S;=)EUGCqy\j3\_\?^-ւ[ye{-5"kAM.veV[߱έm,sL9BKL[XD{>FG U, V`Ny9U!lBJ!E0va膶Hh0"n$Ή"ef`$1igApɄ X?gѰqZ3eP3IrA wћƶuE]~ZC7u_dYE~(&;+vi1SJ 8SD0 Iuh5)u$kyJ@H'1}niRS˄-2@1Ĉ^o@eY"hlmJFZURpՑlh2|!Ww!NB6XGrL,laa- aZhX.h 4sg %QKs4^C!QD{*sU Q+$InjsZU6RGug<1-Vm<}!!-22Dحu b@Ap T%J@yz1~I'(Өyˮ ҙA$jwZ.W?OF lxa'HӬцF''OB%GZ srXc26(L YIi1)ete`b`.ӅFRApei#51Es{O2GӁ PeYZiT5p fN,FPg{zZ424xk#G!`ZmS!TIZG#XB 4J5xzFdR{iVpd3- Q)iG^.FL&yZղW" /8RF6?)XoXVRQN֏Ue [8jA6N+rK2Suv~n6SU=G\yr~6Oy>_lu>JMZHvzexkI[=35lF)ϸEiJVYϔq?U5GXumBS6S˴蟊k+k{q8:I<xPQ\]c텲&M;uNZ=s{]W~w^ht}E o Qz8JM0Vb& 뜗pUN^R!^169١{Ui4Yј$Uz:Lbw2戋]Ci)B4TsS (6D4ia1m2(Rd(OYg0v!9'mXreav㷬*Bm('vo84B@).1nd(W0S^ReFT>Y?U%'cАRQ X': eHSC jJwk[AT51z_6ڬjdhU CCE2P=@(K]n` _tRBTOӋ|$jk3-`d:(}ejGO)^a0*7XE?E>$H#@x'B$/M2 ֔vE@̻f7l #Fh~k>NltjlDQֈɕhk׮EXDǦȣIs&3Ю*ìTC .LΞ%#O^R]MUA\Ә'Κ dmN:b6;̅T5íuUsm)D/mT\9<^-T1ƌu{;5G 6 6m69ҖS"~@&C-<@ &zExmA%m^B^F]آ_\Vvf0zU:OuepsHG=1]FV_~K,FH]C+-؍7~ٲ @$eړ ڐx;ˣMB5 މ/27JN_Um.`+<"^_|k[BwyeoOa) B DDqfX]o6U$2wȋIQ4f.L߱Lf0UQ$F7heC3ltloE\# ZrҔI6 3A[i@ mw k(e@hQ(Cp1|ȎŭD% -m( _@0^E<7+%x`K_kKQv۫t6| J kU9\qRh}Y?Edp<*Ɨӿ=$IOf/){)u v)_><m|7D?PvXn3ri?r8n缑s{6>9U>v8`p4-TwU2),r:1 }o`٬N]| |:MZlg&E@Swup|an8~o_HdzGKC1wfsY!4Ѥ \d4Ҕ‘^܎޵#E^|? 
".쇛&ٶnd'e_%K7[dk&KdWbXWj8UC\@&2?8i {iɂP Tj%B MD''.QUxuJTfYB7]aS-֭~8#5SJm *QεDTj,bVڄiC#8M2ERHɼ3 V@Lc[wjX@ӄP\a(7#9m¶' )4GrNr &7d!_Ӗ.)\L3TV-KZYf4r_v.J/>v/k>܅B T~Yf#g&u$R}u0anEQZF, AxA;7v `Nj󻾈8g2B$I0NQ"&S,dKWt11]/%0J{.PfK [ߏC;JNZObRϊ5ǒ^lA}K jZ{(YqTI?x *_TҫfNka{rF /jwLpq05GJ4eqhN%Rh-8'T{U]6df6~X!wJ9wE[6v\񁉶j^\HJ W }bU_a\jԧz+w^=(?jO\h^y1vC٥cK|MF ׾{1$x "Aڒ2?_9~4 zY noح|C7 {w߹t{d7_ Y^Z*Cqgҙ_d(Ss/<|uʮZVMPs@Aܓ|_) RAT4Q k$ZXBkRԸ+B UagoFZ.ď=~zvʓrj3Ls`nc7uX3}=?D Qҿf7SVRH[8eȅD'qf5۷)ǭYGP7 qafjv3|n=?=׊4l78|&Q㓴WktSx3fg$<[V˾VV<:W?:ֈht<ӧgۈ28%,/T`$e\X=;˨9✝JqhVX̒Wٙagݾh!*pԍFjQpTrl$8%LFv-׎m=op29oh[pqZ!Sq\8i`f B}.ofbg/>| )7u4A &O|e~΍ Q!76Fsf-Qo!$"r3C)LNQ+жҦYs+S 3x: \xzTΧ<;!:4'.d 3Iggr ~4ʝ,ƒ*BX}Q;rbTL1 QaEn8$!c0#Sʨ!5xΦaR +U== uF45JEMK3%!yh0MH9u΀Rp\8B ƥYH#D8A2]$ƠJY K/Ǟ*/4Rx yLc (LW~s(@5גIAoK}YE%/g8`ʸNkɸ76frXSt&ɸ9%JI%kMޚI[ Q*~'w4lYޣ'"LHUJ1b920(il jpBˌ"8KXltQVFWBK½UYL.< KHr2 Ceև4ԧƂ B2ҚrK4kfY\se?DeW|9A62SBRi)BiЏ##DPL biMe oZEQ`+T=|("$^|>=ކkd̜((="4 Wt+ᜉṅ79U0lϯ &ނ>yx|sA$'Žp^'3;6nwf6qQI.|hmjx\a{嶡V/mSHOdmҪU >&͠LI& 4dN7ǹ2aM{&앁 ^=qQN7W_Kg]_Gng}_U<<]iśNI\|tjR,%`oQdyG`$Hi-Y`xe_3a gmz˨YX4O(F-H2ҥ (J9toQZ =7{%Gs6a4c0f6JAehzTpΑHAN?Lv sDzbbJ@K Р5FЊyYif\y&0N -Ă7.) ZB&\w$Zq@d@ENu"c 7>ae4Y1%]JqT śXip%\Xw 1ThO#J ź#%= wc`"tk2{$w9UXPyix!n*7𧤲,ph9(UᡑCC5_IM s\`lRMtd]fxjRX eD n 7N,Q0Xw i &k8.T1`ƸDŽnv'&ll*4Z *jZ«NgN("^3^J@b)RgXj̮rwA?gux2;&?Q.SÅ'8ݬ%Kf hB(XSՅ#2*'׺)0ed0`"2˨@)e8H߰ FTVpS 3rmcTD n,$j< OVM)ʮւݝ`A\WL@7e x&!\1{"c΄K}ͺŖ>Ef*aISAZX2B㛫YqQ/\_+$g)qk|6\%ua:fF"M1(Mk<éPth|h 2v6-+^%k^WQ8領*hF ! :]twuM~om> 5BGI=;L%g[X<'k)N=-gĠH㪑#ཁm)n7%8A0)kSNo{"ؘ6>/rR+YcȓxR 8ohn~ F.;Tjk+y= -V+<: }a8êZ{'GvKTc;Ύ7}unn&|8*)N(fGFMSSrTqɢIaիϵ6 _QFᨡQV?qve^{zpi{1\RWu-)p bY6NZ-gjG|8JItv RpˋKU)X,9O3%̸[pS4܇ur!=?I) WT,Q5 NezP(BLFD2ZC(;پ/PQ~C5ko<\=7X=dNo<~3sg/v BB7W˯o::O#?*ϊ6(۠Hn"-*bЪ:׫Oނ* E:aO@JyUNt@wD_92gEjV< Bp Ze(3IA{;IEs#FUkц &GkJEA%K;F*hBтǛzjˀ?Vw>RYMU}&"Ć>$TACXAWۣ%S{b'H0^;#Ja5PHrgDnh4ԝY)RFψ#`EDC5Ld2 Z!rdj";JF%SVPӄJ':"tHj@vngSLS2قf,\#!#SZ82é]9FcՖ7dɄۻwA+UO&j4e*&x||-g"Їeo׀,EL+\8Dr1`M0az'xܻ< .Bt o , _j@1A]3L}"7(g`F> FqL~Z,=N: fne@boL̓y9׿Ma2Iwc>%M(8a3 Q8ap ɖ>y;,OtO$OqU6ř# I{8(_u84ɚK`-*JFp93#8. j#3~{JA .i4b ᢉ /dC_ \QMqZ]Hc9c w-wՃ)&A/vUq+/)5kFb , p"%9gVڡ.8!\[[2u#OBw9hS',%)FAcU8:͔ѼX.pRpB"vʛ|VKtзa?x ?>D8dydCg5/Yxu}L,IBv5z>O }"+Q'jW7 y6K*:M2X dJ]5wHĿezyjPB$yǨIyx(²¸Mp< Dwl[N'&$a@>?"LnNkStU/?~2s,Ǚ3fC vJQpAW~jq=qQ$6nڃOdY=%jQ>YȧMOΝt|;vMgYߜ iɅ Hƚ}BcMyZ%Wq+,}q|eP~NWT[׮cj^2+^z@ A˒Ek(3E/_hkzߦ3*7:x] t"x3a|;z07eL:?a.\C s\@zKw0T?%i x6хR R}Sh_Lb3Ls> \׋왂S7Wx3pׂIxw1 y93" SLy+ Xa[CJ:<1 3-Eޚ:-(U{rW1Fy%tp*HE;MR@78 Ph*EGA1h#iP0KqP|+W<~\ }09OucRu6#\δȼ2cLǫ$0kG(E.w$-MԝB[ >q`]-6ʵA2%gXz2tDߣ Z">>AgR cOms e۵M"S֯֌ᦼ=R$&/L#[% z]$. cr*scW¤+uJtxVE0U$ A")Zl8e4ٲHfo="AY` DIFi[ Ėr C \q* |* kF9L=N1ŋo>歑"Ge`mS$=V MZȄA:✢FD"5c(Vp9E<ۋK6Zx `r$#0cO\9[~R0[cwE3+r` l QO7<!Zi>Q9*Xoo~~e]ooXp4q'k3U0rWF #W {raaf933O H\*%H5D,?/8~(رu5]Ӂ. |C,^O؉59CKǰocB(bjvSL4][Nn!dgsm&b=b*?޻q $I9 2n`?uzk{~T%u3:gAggQi74FDgQ2}t4H,{RQ t6ia~EȮzN 5%uNXu?ZҚ1H}4y'OO흼awNqϐEjՀ9=$z/[s$ . 56s_[,+@x-c F槳fעzGKЭ734]2@zNznkvQFsLnkzJ5Ւmөzv+y ֨7wG3T;F'HHF5MʎAӾ6M""lOe1DpԷWxpqj^!p3ҿw"%!pOʄ_OخdrnHDġsʯTd8_aגg8_AWDbԇ k*(K5FQ uXHeI|pkA_RMo/˳kq)&Az96:׌M + (qr $>=\1*S?7ܫܡLa9`i}wNx_7| Լ = *̑=ERiz4؂=6R9#roFXad"4 9^9C.iVZn;m428PX'͌*=TiL1rA<%iUnêc\AU{߰ŝw4u8׌qkPDL\<`"eyia[^2aM$4\ )<+6}1:Ìs/$n4Xv ْEȖ,Bd%B,V: tB PA ;,NDy|v%j\W=xMYoy)P)p `x ygM p1rRktm n. 
aSFc=܀\l!0.3-Q37fmE{_"N9^U;}wOww8UFPIV#3b+&+_%ch渳 F,1/x z$+y$pC/Y=D`j% ᾓicKntc2d*:չތ#SG6M*OӮ_݀oTh}]%_B2ozr]o=j#[ 2eYMޯlۅ]>VR?:u-Xex}3<ykIαwUZ e'7Ik6֥U9iǪZSJmɽKoC9EEs ]К83vY)9^Θ@QZumo6Xztm ܔZ7(k#{QW|z\MF[I>-s}.1},U>=l6D\Ic͛ȝ8tɁ*c{H{<9=@;ㅠUj:9M0i~=AUnt5j{/?ȞCs$Pr!BIٛ 38HDJYnDD"3ʹ8/_%C^-E#.ë3-Vzeex DjXq"9A.c2,JaѲm0RްÐMJsc™QC0 g9G!/rMC UQA.vK bwwTwp5;LoDGZLI5Xі*%g5$,b!K2L1&N˳ʫ+x_In-W8s)o` ;3r0B@Cl:] ID %4 Uhn3\U2qc6M:*YV?ez3Yb- 76eNƂn&ݓ.JY. t.wFE<{B J2 `cq3˘1\VdSJ eZzd_Y.Ln 0ܗLJN޼{ ̏)S F1l)? :<ć`oj?PT0ob^Gj2lư6 Yp<_09A6ab=u!0+CTz'x92Ǡl0Xi12֎o]V8s5[,kcހka݊( dfMQD3 {^FhNZ/sUnq9 `@#2IrY3 G,]j_nkZLqZ %A^ Uvn⁓o_ O4ӑ!\tT?ظ~b-hʎR!c7 !Rpnu=)\֨5K1\/nzu9jmKBA#}9=" 1E2J"o̐QP89WwuUu]U/x2xreʛ=SAYg8fŋ(8}7xT|N˓}5ܷ>MJB=vf}w%Ӿk.ԔR֊$PGbY鯝fk2!53:ȗ+%[燯ۨBn[H_uKps*vc/Yi#I gOT:/ |V{F'lIONK.zn2fSK&S>?{o\'^퍂= XHӏLoәW9)}+ǿYJ x66a7^.|Uӫ/o]u޺Swѐ] fg<yk!yP W<-B;]8}=^vgswjw3sR=Oin:uGef &r 0KTYl$,iV/K0_eϡ? M4"|3la4FZ je4Ky zip~G!Bn\o E r㫴N@Wf88t%77UXwɿ7?YUy~ٰ:,V|nns<y )Cv(ڬ_zdd |ԉ .^q7z6r{T._,#"dKuZR R  B@Jq&ٟu4eށ4Ӿ*oj%˦9ӦE KWMMOI᎒{}gQ<)je:#:VI5׌dM7tP,߻)֞$d Oͪ/Roj):8Q!N7G++ . &-5>ѓYʀIgigJv>? 8}q0縓}L&,Vo{?GyH=g̋dLX1hi0w^LU?=u^iE&43y@nm:'ObRd핷{ W&CHem"5Y?]wh\1R. WL,F:5 %&tyYر#oGyA@?-~]G*ڃF4Wm q8B=8^^Ukc#b ZS!u6;PzO)gxs-qvTK0Q1{|^&2pB]J \(dC :b1PZTR|k< Xr &0)}Q @$Kh(\2khywg?~X v6l?h`.p/XC? 2pRoߧ ~" 8Y|9Ј~o]IɐqG믏pFI+&R4M<$}wPagf&N'2ԥngtܙ Μwq0q@ʅY]7{[GP1G:ԹF,%[Ju5鄀@JnSZ|}EcrRd]!cuƉJH%jd\ |K2n Appm339Z3.}9\ƯV_GeLTݗJ'͡2uDO>$CN>$CR=Ⱥg CI@+{eje:G$A]ݕZ|6-: cŠbn-ixu:p! p!B[mZxc6J$s {`UiRP4yA :'2=u Ǡ A"(NAA%.ϼ.a›:H5XX c:NIJTiPeJuŔ_t)]265 3 ǣgfuZJ+'%TPklw=;~ YK k5<ܣn g;1gAeS[˨l6 "1$ aE{y$I) !E 0AF!D Ii9oox2"YxM_M> r ykYGkzܿz7${'p] fD_"$( 3:Rg$7sJS&'H^)[cLap35(Xֺ@Nײy5B0"eJ & 4dъ)JJ!R!V;2 "%m,t_+-Z*poIh`]{o*8d q{ A .ut+K:z$5̮kiez%)`$ҊK3ΐa ,I*S)0CHxPlgS9LQD0(2p fvh dZEr05B\2.AhIm$ wv1~" %!J+1Pi;k$TQ0mdlvVk+Mpb`Yj \Jb[l)L ^VHU%)a@[cJ$[ΤQW6 c>-6)Q BmiՎT`X U--T\86t m#dR7\%JA[F2ѧeL`Sx]IZY)(J 4$p-9ޢ[żu`j; ޔd{[Q\EL*Ik-UJ-nX5 Ԝ+[*D*-m xK+&8&+JDוV-Ȟv8J֏O'Hg\oxMφ!&\^P?׷h.h8P+m(='槪+P+K*OHhSpƖsjB)ٺfQ(ը&$aR[n)㤶$-q,R$ pC6hBb[DHb"(8T[*Q ڭ/q,TRnR{U{2ehePl˞ e6hcIlECj @rAY@3 lK|%c`'LǔiI-8K K*5`´DpcyL2 x҄R@ D$!Wԥ T7^[8Tj%V7X)߈bEXYz#;FR27j JAuGTUK|Y&?'e=KR\w2dz}g> '6^|Reߧuqʮ[vՊj"tz?fZ 47d2^aA5?ex}Ww6w?,{h߷nyӠN4+p17oQ9S|jj~ [^5iϮz閾\%I}ϐ{.˚0iy5(Cݟx6L٢̣lxajl= ~T8t̞Ic~h))zASB]cg Ǝǎ&McC$<ف5l zͼ{>ѻXGr[ %f'rqs@P[xE'Qs-]7[V^<,'..Q)G!Gc0{_B3?β'噥ˀX.4&`$9Z_QڱƽkԹB'WFA;f.EO('[P[79=Z1`E%+Z'2"oc\JLKa'E%FhB!I]½ x 4%֥ BJ i8]F;'rbqXcE(sܚfD<=Ji'e˞vơ& x;8@,eM> #6])N  Ӵ%FW6fycKOKΆT/?e&z鴽Vj;demlĹױm K}w+N&W*=\JM.;y滜ZCټ`++Y腝u%'OpV2Nmwpy`+y"KWx?c/iGۅz+ٹbeKsUY )XfUk8Lz|$FuR0)0gm- G+ۡ ɺp-)fG8rr+O 1X2k9o ݒ+ui{Pyr y&QFMޣLw#GV]2>+hzB#k:lq|ݛ"0,{Z`5  ,0<tZ,B꾏llnk3|*zn þ>Nyڠ ]0I!1y-N#" zϱ/s ,KټO-sm$i6Ox/XZȐӣE}ɬ?-,l:sԏZb;CJifz`D:ҕ[N:-fбt.x%trYet%i2+Ku}MEMGvxο1qfV%'ᅛmL#* BT760S(.(a4ZN\dJITx$,J3siٟM YܹKz5bf:xC9a$Q2*0LdN 5\y&I n!u/WJ0~ 9dF[Mf̻:8bi v|tI9v{3O?LI]wXr@py Ԥ%bO-u>[ErAO~={uqImݵm}9>|1lo]|z=덳cW=W^s{zy̲7Ey.dY1ΆnxM2KÑ_z&xc[,=V'UQ$zrrGkq]].[Ym?QV3dI^Jjhb|=}9e p&og F01x`P+gܷWë E2ܭ`ɎeEW|+|s^.|e;L;rQ].o6h)A ~OWt>G7 0oU +XCv[hm< ˳*X.F?#LfNRcVdiҲ1AsKsj'K).L|A7_MLa~ AY,gp\j/xL,P=L kjNێQ ٖ\#՘WG)w#c\u3Y#BRj,_OS G4ٍ()OwƖ)(0<^c[KAFwH_i̗En|5|2`'r ȱ~L<۶!7%Jj`)bUXRc.ŃE*Ep+pՊq=R*=H_ Q6e)CT iwYRhz:z.2QQt)id)u V(:i單Yi 0@[ /ObBt42fYR٣,h`!JYbDKD)UE}kI =̧ϊGYl*a!ƕBrY*G!F]Y1EIRD)} ewH,$ Y}7e|?{=CȦ9#X":ku /kTuv@Ztb\%D= $04;D$ZÀvDz*gąYҠcJI;^B䈐t5$$ XA!awoPp8QZ 8Ͷ(DcꗮBb#}:'e}VG?"?c}*e cLЯRcy-!P bU>8%)\G&jB 8¨xM@43H$G4AU0"!1kd+"W5E+抪)jc( UQ*4[\aq p VHNR~Kܙ*pJRsї[v6 prVG՞ƪPbꊈ33F723gB5ܺ.meMD޽{b\|׮o^?lOYrjl8Ɔ)V2mC=$O7rdZF3Ax]Фo(AW Tw'IyG>"M 
aK0׊TnRPH՜XԲ7p,ҵn,:),2WUikQ(t-vWw[h̪D@JMʖc(Z.]*eԵ!߽CNϐx:-+7˕}?ӳŻ٭KGiӛDW}J@T 9To?ÝsJB-FpwKGwW׋Y;*iD?5b.8PQU Ǩԕj:^6˺/G 1P-VsoU8ĕd_?`\~<~Q{K&ԌǁiĉY1)+@OfF^O'K&@\8s3'0'~F,Ǯx: z8 IlÜ.%֪%ɀ> :{{[BrOΈ㔭,4£zq]4ܔ̀ey |`hο`i=D$rC}IEsFg\BJDz8/ѽ 3'Rѹ\έqNQFaQU=5:qUl\/<~~1peah!ws;'ב'@ߛ/=+2f褵Zsfr̈vSk*{SoJm? Rr_qNJJe һ } k6%'ڡ\29͜>=Z Byfߞ*nujYO ^*ϏP$1.ZJCWdUJM_]s!"Ϳ?bΓ9O*漮bo@r;P0P yc DrnF:RN#nDJ=%~;[T7FCog#ft_PWB~ 7gPeU΢GuP7uYe,5*<T7࢕@!>jƫJE!T!X*EIOAjp>\]z+y!ڨi! lڒәTQC6m<>>(D/lJ\ݕJy. g֣g՗aI}x˻̈!B>oj&5L{F:{Y$y&~^߭*{!o\ş/o./]/778>Mpܐ"(I6?2inv.tMmKqRΗmtw! /!\j \*G9'ȜI5[0{7鯔şn> d7~IP>j笺(ENAz DeN"zwiZQ .b;ӎcZtu26NzcœVY3&>9^yF 0Χ'iA`S,PdN95(O>O>tPDxaEiZ.A{4u<>Ve-~(o%0`ɠ-إ_jGc{)&w94Ӎf7UxGJ ^Zf۳ZHPp;nS%?PxN(_<%*Bv^3JAAMku- q#A [ٟr;O*>ؒ,w.DvB0oW7 g%Q*@z̩3#f=ȭΊQ?+#> )$ZpNBfg] sgH;4{ag).@SSTwK U!x :A8n*44 )3!)"PԯDxH-(ùcQ)u{צJʲr6,!':Z@L>۾r9R&{꜎5|??˂SK͖#ΫeYpyV3aCН@`r\go&c폟_1jk3+u69컫/.oVo@Voe7{GpQqiy;bCq9I\4 Ni`H_ןwqVC~!X#XU&Z;|wvq?"i7 2{]==}9Hqs-énѣsq4#*=sT1}ǭa3pqn}sWIV?naG6GJؔcM#g^OZ )ħ]C# s{06 Aq P̀`{p@}Qރ T1cO ,$ 4(\Mg0p4'wPv"y. Ր МƷ~weg nwf0Xs/]dWGgiD4\Pzƣqʝ# 0^|xiJP>Pq*1IֽbJj^xث|4Y*XJ;OYITŀi] =(gſAD+bXhkA!D​e?{WF /;3E>vX*pZ&)7QEJUlS,_&%/)>v"~M }Ғ8ysv嵇4tΆVW'0,?>]^SFm4ds064C ƚc{ 0t?{M#4Œr!q~'|ȃgЫж{$,GXZ9=mB"2d .4/RweBM.˰8(eX<ݹ27TF%^,\: [p@a:xˈ` ѪL2cTu@DIcbegGbb$qSy_֙b9꜋u gcPtVZu_uaʣS=vX֝'ΕYVA%+Tn#+N1bc`>C̚aF1)2(ł:iiԷmvB2On5%A"D]nJk(-*γ*I[ޅݻZ+)14BZ?YjDTIE;8IZ%8U^A]wwN")i$fOb>m,J,FTs0FdYV_+ug)5Fdz f|V048g[&7S^3ʷ[wp]6 wƇ隽זݝP$2abĜ`eȈ=3Fq 6JfvxL4nǼwg:hSUۀ6q9!2Zi6,XlL:cE1HAq,\ ͲgYumN7%#m~9"`8$Pۂ~ 7wbn_s'aΊ un:36hթ˛p<wl|~R9Td^tL;A]c bU 2M]{!W r7^lV !طIQ\M]p7*ܴ߫d~Z@nāH]7"Ml%EeܜXJ+#S_a.I'oD9gS$0`y@ B۪s 2*YúDT3GH)eyw٦Y= t{O|^ӓ!N54|9/˧*5(8J!Q㴿 NMkZ8?MӴbG1nın* }\']` of޲./Vg[8DR8bcXW^35V1"mEq}tu_:un[r)Jb~Mm0mPS_*G= q=ej17nN%$E"h5 lqtrh]}LY!b3f$l٤+}SFCPVV Iևa մ njs;kTuGUZNOoGJO KY0d,M_/V3݀_~|x='o{ .~4Cor1 qe v׹fH/7r§\o02FBXc$hfJ Hj*IAL*FBqACS(xD,HAVFeYD*YtߒsPPo iNk"W*]} rwF`z55jp%_R[|F`';2ԾG9xC"n .|Fm6(p/WE]x,t*ߎYz ؠݿr |uTi*iHz}b4*2wg16ܿoBUEr`se,Hoa쫽,H4GR^1[RL9EwLv:K W yݠgԪ0waeV}Z#;ARv:rczĜԑ]I:mZEjɟDF;K9J%8IS-# EăeAZ@U^2Bi̺_XYd %ٕxa3kƘ!m4TdV[{7edHڊF  )T:F:T 1B.gvJ M2J.ʶzrZǤA/w.ZC/5Ԉ^L*fYÖwгkR9wԁjîzJ)핗C}g~ _XD9pKJ:Y]9!*H߼Pk)c5 ՖΎ ElI_` h1r֢r=ŪIgcFo'iQcNnK'\]Ӫm-S\=HҺvdZ )!v<ٽ)oRԈfEl9S[ӕP7cfhԌ6 1#:sLt*+!d:fvѕ.lzTטuPivԗg-Rg~E58,NzlOױO&9zNY{J(HJbh!4FtΚ\/î)~K(2^L覕{ܶhVE࠳Utܮo:us~Qvh݄S5n+ӧOCl)l’wp G Q)ds` rfa~ ~z0w`Of OS>Ēt? c)ڄc:xˈ:cѪL2cTN ]@XgrX/1W?&ZXV(u2a  D 82(ɘ!.*扔.0}\2-n\֚1T4ͱeUԪoH+KJ T>qTA0!)s<0m.*B1Q,jZ)eNE1;^Gq%9ZdZ"wc#_(R`;fj9O$O ùKͮ.<}lCko&Ϗ24./Ӹ,e n:iq"R]  ^I|T8~HhG1"+"|:[|PE|e0Xa~?Mtc;0$ /6_ ƾ˟).8,Q妾^[J#g3l~-_:4ReJic9seƻЁG^jU*2䘌*8F>X9jͨq^ 5)8"]Gm9G3hxQpOViE9Cu S(dSsʬ9.k@r3p?Ld|j"I|lv/JUx(yFF t Nr%_BDdg^BEgwCRﯯaz~;A~{psW a@bk3c|;]V#ͥ֞@xH~e-t{ mmh/zjlhb,U“BlOnj&ro6qMj 2hI72550! |xi0xJ!^'V451[LMhSTNmltI A+#4_75RH`b7g"q)0`=4>H=Rg򚑨iXhѨ}t>!*L"JQ>-l|Kxo0$2 Ƹ+ㆇ7-|fvɿØc u-7qo \|(&/{Blc@M0UZnמZ {ʤ1QpM>ֻT!x~N^f+u}ӒV s بb!a!!U46a0b5Q#La6I/|FZXH4>*HҤˎ'HJ済 kc`k*x!V(K :d P^}4# )EZtq&ٶB[xG_VRM⋬ hxSHf)"T11 ,VN@+~˄y=uz<=^`=YSpb#kδ/5c[U k3Z1oڄA .Yp.I{)r5WдF9ԛ Ӻj`M ϯܤ܉-*WT}5T xC =k`;-݆;hTΰ0:(A3OEvP>ќLR:6pJp~,5NB>?8h}шxwyi)i1ޝTy%Ji&ˈ?hz}%vwh:(lR%ٔb_lD`#`Q,=CLAؾ{;^4fӣE}պc7F؞/E 2%yo"));t/ @Vjnh+X58C9aƺz9|*{q}nkfw a 4{=/Y_ԯw Iiέafl"w&y\7U3tXi_]ҫ,!&eܱ얣0-jZSkƬGiU(`-c iFLZى,l;pޡۗkcni5$jљ|O3!yJ h/>s(-_$Ef_24ˏҽ=Dˤӓ9r2qkɜ}:,Kf'}8W/t`\cv^rUv褋Y# y *(KAf, ~\,Hy<9??:/V3Ea"-H YZFj85&r<ǏCkhbr_"!Q5b$_/K{H 'MY_} 34فn{i5gxK-}Y,:y?4"JPpiaVm48,>>$D%J @tYI?HJ8g6(R2LK FIWЉI>~Z. 
(^OeFI⺍zP, %5U6it:aNwX655" 6x@䶲@Hf~lM$ P܃BŤ]A^Ir%AAR>v;=#h3iyѦ ZKXHPJy59̤h9pE[c49r^C kJ9Ԍt ݍ*"Sl#0x%TdPb$;"` cCx&V d7,b({s Kw"E>s=z55v!CI^v.NH>H1WC@n9VQZYԘI&Ŧ0!ad IȁBM8H׹~l2YbȃӠG\.#*.UY=8)IـRG2%mw)%&Cp[%aF^/}a9\cb0i4 #*KJ&gա Btď { .a9E~?\.J:je_g NYgji _|vEB?9r _O(ug4 ;ޖٟf; N|uzJw){Ω\;o9/Vi۳Pk!ɳ[%~}=-R I%>ң3W]M[*b`~{ȪyV?lZ pF3)Af9sLd)r2R:N.[J*&'x }1IS(aX(m%EBx%p{7[:\\%>LQC򍕺' QN&V( $ѓ 0YK89Ɋar0bP"YR|RD6#9ECt4bZa))O% ڑ; /M ]q8fKG &C\Ld%qCR#CiEf!ȑ8~1ɚfUNd҈|t00e2)@PR>y '~(n@N_ރL$%Ɛh" (͝but+FHՓ62y@|KI˭2%$-L CQg$[er(*>k>-w6ׁF䔓_ǒ-ř"r4Fy$H gɅZ!Y42J}9бD0flCjG YWDyH m/5{kjR6RY-n@@;ITnwHyh Eč+8<]rCkF&8CʪFy x;$A; ,Ko9D9@yIOhCDɅTt;PRH}ޒtcCk $O*Ii9BTo'w7ȑƍ-Gw8 6Ӽ6Drv雡Hc#39UYRh R( `$Pp0A=Nh0 9eQ{. sª[FYٛuِV j.T?_ߖ\ʍaOͫGg"|?'!WdmRH5!Yk S޸l'9UxʸP+%8~檚ף7]*n9z=zf;|pwUEUΔ6>8ݭ Ypw6ֈpk8a/Ȑ(vc,|w-Y=;9/_G/G,;Ѻ"AMT݀Nq"]?ЀI%b z9cNeeYeowY&- pA^=$7QwJk)كv@ɇ7kݿȕS=[oМ:Z-7iB1yt}S#]83+GZQɹ3~tg8юq;k itIG.fw)bo7LxN@;ڃ$0clRj#<(J21xoGQ`+鿲T@ʋF l.@~4fn4yPvq,;`YdTY2^ bJ]^PhlDQI]X.F-E;޵mc 2o<4 ;fiaP$vRUi'_/*UU%[@'nt}DN 0Ylw081M+A.B&v.g\j`5mziXJ[{CKnl&:KCB1igQ؛[SDO*-C@!4@[3E!X6yϴ +暁"QӲt{oCj[ư ܾZ`*W2\]|̻ %3eU I+9}. -<"DӰ1*>(cR CF38/0^bJ(K< bG躰TrI"%+El"p՗TT,X `M%@yf:]u$3VuaԔg6줒W VYcL /Io|@˕Z ]CE[6Yc fw0(fsZA!O5ĻzT0J3^ AaR-دY- 6߇h TƗ,@ڴ3 g +h}q~,~-ŸF8\p ~ڼz6eRwٻ>^0sW_o}{,ZwNĿf&þ{sm&x1.(,h=+tzU0闟,7A۽P(2(>W6El#Ev蕆@QR #ERvd"P nY Xz&i5S|:7wiGu* 55y4{EgeBc݄χm1EgWW>$ rsċ o\gzOg??j~{/%~Muөzj(կCM܌(?X{HO~A} iiP#vъa n09{q :$yLX엿5Hc331飽xC?mWmF3fLg_H 2?Y>Gp *ЬVzۥʘG !|U%(Eť4xÒdFrg+bxp- 2%7J3߃. Hwk:|2$f 9fƫN43(m5уƈK]>?ޑy?KfK(brO:cuǺxx_hT`,Rf p/]YUnuMbiJ2ˊtxGهvL܋Zzw[~[ "-(cwW_ ߖo-/|_.gnΏ"JE#b*$LdlW̗ eSK9ѕ-%*)젖=z, #J)#JV?V@C<2T-yu~OjS)ۺT5([@QtPǯP'YL-Z0*RRb7v]Ab B,pםiTT(@ Fy7\w/<+I6QV)*㩣2)xUc:Ῠ㺷@>nWf?싕!3JSL$/TrJ{K]Jp%'9zӢN@UF(3VJ'L%MQ19~/gOgNY٣UP?-ʲ"1{`V&6ZZЛx aAs?Up4 ~/|K ]|Rύiݡir!*,JBXXT-:,?MPߚdžgiO=vC au|rɼ}A5+zZ΂ /RUW!ʁfeɃE!J(n@mWT6cQO%sySJU!ي- kX;xsFW@7|bL3uZ;,(D4x!&Xf"([yL Mġ턞dq{^p K7g?{M"*#ݛNY2GO0ihWRI Z47O= ?3BޭNcw)v9Ͷ;* u@5^$# '>7$qOA|`@M~Hg{ 0񣑚,>]zD/e/v|1ZR'N[9LX&;ay@k0{v~giNC%ag(T.{vIΞf1dZ1<-zQ7\z05=x=Q'? ' )O蔼S~t]F9XW-!OA{7Nr*uڤ[sȤ$hӽGHPJ=}km|`^`IɸɃ!(e\@{ƏJ'ud].> pK_3\C+kPY 3x$h Q#)S$dH)ڋNKNE4&0L nn}MȼXWG2RCM| Ԁ=))kԞzUnbJa8me,@$jt`mķ -ᆴruNf?/wG'f *3Si7J X wB"l 3?mY~1"G`AYB3'pa :ȃf*}3V#)s=gJV J;HԶ>ѓFzrw͉aɈV}22Cv"OAjB/Qe7tVPM ;py'.URChq HF2;Nq*hOhk&^2o[ܙ)X`9D{oҳC Lf`}z^r2jfGD2!5LXvxӱlƤ>ʒoAU(3~AQ_ϯ.YE?aˬ8sxQЋ=.r Bhr>ty'%}TM(Q=o&7Ѐ"+ ʭթ|wH *El[aoM6N 9 _oxzIDv[aD|ŎLWI h&s^8?WvQ8lx%cA,tE@'*%8@8>d^͑K)lG$l-Sa-.ݡ]v,-xJea}ت ږLʲ2YS/.)n-jHzSgYJUlnW%#FSښy $9:DԐ[ W ΞM!ӣ=׏N=TQ\u?FFVv4(cBW;&mvﮌ+ۯurh]b/`(NN{BGXyt]JIBfIAY?ς֖Sz~!ԎRsE2iˠ]g(sPwH}Кwiȿ_x_6W|ig,?mUBgQZpő~^xarc,1( (,ƈso y)_i[{1WXNݤ*UY2 W۲Mx$yQFjr\%wAz;5%.p Ű/5p8I?x J L{X=Y[r+0<"$*TF'}vrh0UGֈn̪O^ 5v=?8]4T)RA~r߻o`WV |*[2ٻPNC+ O[ %KmNQ/ߟXEŒFs1jR\Uޜ q?, +b~V]\.ٻ6W~YÀ>,,".I=}XL$J!);"}IJCp-FlÙr|1 !FdZ \‡/7_NvKX[5xHO8$P渡`3`N3Ác4{w%AYe6E=? Y:S fLLu^ ߦA/~`fxsá[Hџj؊:,@|)FWUWvSDޥ~M%PWpYaWac)"1⨑߅HiE[m9fZw1uI'j`5wK%\Oq_Zyʦk;Pf *P諲kT3 ΌG?=?3s7`3|9SqK,P!l)4z}`(&mp`q(ohVR6ݬɗ"-?^h*)TYu{C+R\ wD)eJK2UR:pHfb% 7r&dJpSH6d^3 K3l`T6ϧ D:IwXSgS8 mܦB I;VhGaM`,f Sg'mB_E=MVPqaj1&V Z!cOfFK!.)_*$]bST9n Z2 ᝊp+/y.U tvax_WZ&[Zdۓu"'B1a %3䔡Nzܦ\{.׊9WW^!QPj%,NfL)iHs!G))ghx`8nfW! 
?ess[]+ϱ9"Y{ۺu Py13CӾq(}{/\vx9:sq;&̖NT,}K|\Os^>hݦqyά(E3UsE2Wnd {6whxw +Ԏ9g#K99[per\x7g<h Fw_>wWEsvqs m--գҥvYKdGX*Z%=y4M̼1'ȏ-XvmlFxCvpjswP58:(n}rV__Gkx0)|>|~M>vk$Zn*)@vBie.8UXĩ.vryN%P A*p2hNE@hct%d!8¼ ϝ߽4nvx_8[7[V!VfH]-eo#%,(BDR3-ras- α"\!rpZE`SŃ*4X_0}ǐ೙ٛ%~} M|hC^LoA=~:8%w :p~ 1 %w}1 Ai,cmP ɭ\R{XnXRXg<v~2[`?؝q:``L7Zly|j?Ҹ  B Uf΁;믑V;E1v D!9X ʁHnMf8Pr4{i}P-1.ZVnyw"wUw8@=RR2 hd* gL0' KС/ lF$u1{PI灀d*D *x:.on8~f.!% ~|7h2-9-8 ?ܚKS*;K~xH6|d72`<{pw7"8i2\QKn(1'{s8FԵEB-2A&x kLIo$ a -컺=!W5V( ʝl34lmNdns LU,FyȉJQr %l `vuVIsm ܬ3&PJe !r0@5IK`>Ƞqf0l00bJN`(#iFz,kI $2[9:20cTy0 .ɔ280i/qRF:9_ʘ l,nUK W`\~N_xe zg5zg o*7RH=;W,Z@M8(&ݗUu-[D u7)p<)`qu&v2P9 umhsG:/5QL6'cpZ;+Q+.馞6ޚj|M%AO]S#O@+u WVRB4%9:!z-A6ׁutYAu8}9f t:Vqz95s.fT['9UQN'.JBl Ba pUD]qM5& iƒQpf|lauK7q* Mӕ $F.@)!;qK]#&O/P\QE,wMz.Eg<ܣpěvo5Y gX;gܠtu\*!2BzNmdnMf38lߕ[bDYIn %TtfL+'*ݘHR%\ϘH;> B-.T+'-Q`?9XQ]j79lq2(lbutqjقr*q2&`),Y[.2an[H\FD\@~ܚ[[^4v]ko9fsFn'Œ>ť^;gӃ;^sq WrQeqi:sY$!b~se%ӎf`}22Ypݙ=XpS{ƒ\L;ނ&v?VR XCmn`boAD7G u{۫;=lM8k qZE#lU+N[",?z`:-ɩќsy03͝ĜRf1G 7WLk{rbs~HKdOײk]Ӕ)Һ?ZW #DoW N`JˊD-IT\9TTG~uu}qoЧ &}x.Le=03303&7+^_+VFruօVͱ._E`ګ?wÊNGq|ȱ` C :TR[ΦcJ`9)ݿh$c*\h[.SY^>JMG$˩/2*V$|^iоNÔsb=kgJ7-%8]iAv8"Qk#Od8Ql:'*pa8Щv:l=mIŞ_Ԝ;j+_p+Xq\)k '3AE2A®m~vlg ~"xk(#wң`g?=l/89aaM:T6 ٬wBzeY?k2_H^v-$mZF{E.(F{Ff1OB6 e7/5DkknH]Uzڊ+J%G5HX^x] )")P$HqT"랞F]2 YF.g/eG o뾱>d+VXr)u=[E]C;h "XBDN݅p%7Z&(kqQk1 QXJI xs 5!,0Do0㱥qrXb#bMjĥq„0"Mq(74n2.î:Vyձ Qzgfm=g[<[v%^*h93[fc Vg$0]P%_ZJlZgԃ!j2:u7(%n|d%Â+ ^bTIt}չpr堬Wr |8xA"\k4E^;uxR n<`!= s5X{_!-Dݿq&U-;+gT2j9B=a^tޠT_1"e艅W@l0[x;*A`uAx4+4^/vcuŧf!ʵZ U՚grM;CIְ\c$bYB8 &5 'Kɬ@5ۘDgcֿvGZvz6 mG}91o8= 0ϻao\fZ"HSNE0X3M>ۘ<'{mgP[R?hsMw0 x196 k4LHSyrS;X(*\Ӱ& بSF|/`O6E%{mySNRsc G*fۉ̡m{.:1Ʒ{ïxS%\lqouF3BR.ѽ!Iv*f_v߂c=%x#t0_[Π=W9o.v,*S\l1֕ﰧYĶ"'9h%hM rEW)\PnF69/Zw/^]_$keïyI6^/&)h4!y²_ BlsL {: NVN-NRR%sF=QTYFtjaH3A71G[$D6ńVl#ٮ {-LB,+A2Dx͑:?4_~-K3V2̨]_U%M@q 9)kGeL1~1kĬʳ٦U| V^*qO<<ظJxȄ+^|*1=}bFTM}bfM`9x0`c%a1K بY~tEK7R*tL *ȻN-^ZfZЌb~vҍBAjh;~hs18>7dqVN4(GTսmX.Gms틌:Y1w>,}gÇn?>eXAw}dO"7ݟ92_s74@%3ɔE7`t٦ ܷ"ۦDrEdksc01&9.spnt$Y$1/׃bzE1(ndmkO#%W֏ǿw~B }?>|om'?/_\wo^\fZ#GX>fn{yO? 
Jan 29 16:25:54 crc kubenswrapper[4886]: I0129 16:25:54.306312 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qjqm7" podUID="057806c7-b5ca-43df-91c7-30a2dc58c011" containerName="registry-server" probeResult="failure" output=<
Jan 29 16:25:54 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s
Jan 29 16:25:54 crc kubenswrapper[4886]: >
Jan 29 16:25:55 crc kubenswrapper[4886]: I0129 16:25:55.226026 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zs9nq"
Jan 29 16:25:55 crc kubenswrapper[4886]: I0129 16:25:55.226093 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zs9nq"
Jan 29 16:25:55 crc kubenswrapper[4886]: I0129 16:25:55.269894 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zs9nq"
Jan 29 16:25:55 crc kubenswrapper[4886]: I0129 16:25:55.741787 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zs9nq"
Jan 29 16:25:55 crc kubenswrapper[4886]: I0129 16:25:55.835365 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6hph6"
Jan 29 16:25:55 crc kubenswrapper[4886]: I0129 16:25:55.835610 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6hph6"
Jan 29 16:25:56 crc kubenswrapper[4886]: I0129 16:25:56.252385 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4jbxl"
Jan 29 16:25:56 crc kubenswrapper[4886]: I0129 16:25:56.252440 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4jbxl"
Jan 29 16:25:56 crc kubenswrapper[4886]: I0129 16:25:56.335835 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4jbxl"
Jan 29 16:25:56 crc kubenswrapper[4886]: I0129 16:25:56.751662 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4jbxl"
Jan 29 16:25:56 crc kubenswrapper[4886]: I0129 16:25:56.873443 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6hph6" podUID="c36e6697-37b9-4b10-baea-0f9c92014c79" containerName="registry-server" probeResult="failure" output=<
Jan 29 16:25:56 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s
Jan 29 16:25:56 crc kubenswrapper[4886]: >
Jan 29 16:25:58 crc kubenswrapper[4886]: I0129 16:25:58.927485 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs9nq"]
Jan 29 16:25:58 crc kubenswrapper[4886]: I0129 16:25:58.927857 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zs9nq" podUID="dd20d05f-cd0f-401e-b18a-2f89354792d0" containerName="registry-server" containerID="cri-o://3aec1abede58b8faa82b73ab79ff75672caa26cb287c28081010173343956dcc" gracePeriod=2
Jan 29 16:25:59 crc kubenswrapper[4886]: I0129 16:25:59.660569 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 16:25:59 crc kubenswrapper[4886]: I0129 16:25:59.660947 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 16:25:59 crc kubenswrapper[4886]: I0129 16:25:59.661134 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp"
Jan 29 16:25:59 crc kubenswrapper[4886]: I0129 16:25:59.662018 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8055fe73a1cd8fb346a9937fb9960eb4b8cf16950f5ed88b206f4a30871b1028"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 29 16:25:59 crc kubenswrapper[4886]: I0129 16:25:59.662359 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://8055fe73a1cd8fb346a9937fb9960eb4b8cf16950f5ed88b206f4a30871b1028" gracePeriod=600
Jan 29 16:25:59 crc kubenswrapper[4886]: I0129 16:25:59.913993 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4jbxl"]
Jan 29 16:25:59 crc kubenswrapper[4886]: I0129 16:25:59.914405 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4jbxl" podUID="a710476e-74f4-4f7e-ab94-d2428bade61e" containerName="registry-server" containerID="cri-o://b22791e3d9f615101442f2f7febeb8dc3309e984e4f279202303392053825edf" gracePeriod=2
Jan 29 16:26:01 crc kubenswrapper[4886]: I0129 16:26:01.743584 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="8055fe73a1cd8fb346a9937fb9960eb4b8cf16950f5ed88b206f4a30871b1028" exitCode=0
Jan 29 16:26:01 crc kubenswrapper[4886]: I0129 16:26:01.743700 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"8055fe73a1cd8fb346a9937fb9960eb4b8cf16950f5ed88b206f4a30871b1028"}
Jan 29 16:26:01 crc kubenswrapper[4886]: I0129 16:26:01.747662 4886 generic.go:334] "Generic (PLEG): container finished" podID="dd20d05f-cd0f-401e-b18a-2f89354792d0" containerID="3aec1abede58b8faa82b73ab79ff75672caa26cb287c28081010173343956dcc" exitCode=0
Jan 29 16:26:01 crc kubenswrapper[4886]: I0129 16:26:01.747704 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs9nq" event={"ID":"dd20d05f-cd0f-401e-b18a-2f89354792d0","Type":"ContainerDied","Data":"3aec1abede58b8faa82b73ab79ff75672caa26cb287c28081010173343956dcc"}
Jan 29 16:26:02 crc kubenswrapper[4886]: I0129 16:26:02.688147 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cj9vs"
Jan 29 16:26:02 crc kubenswrapper[4886]: I0129 16:26:02.740936 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cj9vs"
Jan 29 16:26:02 crc kubenswrapper[4886]: I0129 16:26:02.761522 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jbxl" event={"ID":"a710476e-74f4-4f7e-ab94-d2428bade61e","Type":"ContainerDied","Data":"b22791e3d9f615101442f2f7febeb8dc3309e984e4f279202303392053825edf"}
Jan 29 16:26:02 crc kubenswrapper[4886]: I0129 16:26:02.761609 4886 generic.go:334] "Generic (PLEG): container finished" podID="a710476e-74f4-4f7e-ab94-d2428bade61e" containerID="b22791e3d9f615101442f2f7febeb8dc3309e984e4f279202303392053825edf" exitCode=0
Jan 29 16:26:02 crc kubenswrapper[4886]: I0129 16:26:02.843248 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xcj6l"
Jan 29 16:26:03 crc kubenswrapper[4886]: I0129 16:26:03.316284 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qjqm7"
Jan 29 16:26:03 crc kubenswrapper[4886]: I0129 16:26:03.353447 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qjqm7"
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.035623 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs9nq"
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.116557 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd20d05f-cd0f-401e-b18a-2f89354792d0-utilities\") pod \"dd20d05f-cd0f-401e-b18a-2f89354792d0\" (UID: \"dd20d05f-cd0f-401e-b18a-2f89354792d0\") "
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.116630 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96zvz\" (UniqueName: \"kubernetes.io/projected/dd20d05f-cd0f-401e-b18a-2f89354792d0-kube-api-access-96zvz\") pod \"dd20d05f-cd0f-401e-b18a-2f89354792d0\" (UID: \"dd20d05f-cd0f-401e-b18a-2f89354792d0\") "
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.116671 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd20d05f-cd0f-401e-b18a-2f89354792d0-catalog-content\") pod \"dd20d05f-cd0f-401e-b18a-2f89354792d0\" (UID: \"dd20d05f-cd0f-401e-b18a-2f89354792d0\") "
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.117742 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd20d05f-cd0f-401e-b18a-2f89354792d0-utilities" (OuterVolumeSpecName: "utilities") pod "dd20d05f-cd0f-401e-b18a-2f89354792d0" (UID: "dd20d05f-cd0f-401e-b18a-2f89354792d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.123074 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd20d05f-cd0f-401e-b18a-2f89354792d0-kube-api-access-96zvz" (OuterVolumeSpecName: "kube-api-access-96zvz") pod "dd20d05f-cd0f-401e-b18a-2f89354792d0" (UID: "dd20d05f-cd0f-401e-b18a-2f89354792d0"). InnerVolumeSpecName "kube-api-access-96zvz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.157097 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd20d05f-cd0f-401e-b18a-2f89354792d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd20d05f-cd0f-401e-b18a-2f89354792d0" (UID: "dd20d05f-cd0f-401e-b18a-2f89354792d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.218114 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd20d05f-cd0f-401e-b18a-2f89354792d0-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.218143 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96zvz\" (UniqueName: \"kubernetes.io/projected/dd20d05f-cd0f-401e-b18a-2f89354792d0-kube-api-access-96zvz\") on node \"crc\" DevicePath \"\""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.218153 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd20d05f-cd0f-401e-b18a-2f89354792d0-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.347531 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4jbxl"
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.419493 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a710476e-74f4-4f7e-ab94-d2428bade61e-utilities\") pod \"a710476e-74f4-4f7e-ab94-d2428bade61e\" (UID: \"a710476e-74f4-4f7e-ab94-d2428bade61e\") "
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.419627 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkvmx\" (UniqueName: \"kubernetes.io/projected/a710476e-74f4-4f7e-ab94-d2428bade61e-kube-api-access-kkvmx\") pod \"a710476e-74f4-4f7e-ab94-d2428bade61e\" (UID: \"a710476e-74f4-4f7e-ab94-d2428bade61e\") "
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.419657 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a710476e-74f4-4f7e-ab94-d2428bade61e-catalog-content\") pod \"a710476e-74f4-4f7e-ab94-d2428bade61e\" (UID: \"a710476e-74f4-4f7e-ab94-d2428bade61e\") "
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.421546 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a710476e-74f4-4f7e-ab94-d2428bade61e-utilities" (OuterVolumeSpecName: "utilities") pod "a710476e-74f4-4f7e-ab94-d2428bade61e" (UID: "a710476e-74f4-4f7e-ab94-d2428bade61e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.423056 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a710476e-74f4-4f7e-ab94-d2428bade61e-kube-api-access-kkvmx" (OuterVolumeSpecName: "kube-api-access-kkvmx") pod "a710476e-74f4-4f7e-ab94-d2428bade61e" (UID: "a710476e-74f4-4f7e-ab94-d2428bade61e"). InnerVolumeSpecName "kube-api-access-kkvmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.521793 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkvmx\" (UniqueName: \"kubernetes.io/projected/a710476e-74f4-4f7e-ab94-d2428bade61e-kube-api-access-kkvmx\") on node \"crc\" DevicePath \"\""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.521872 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a710476e-74f4-4f7e-ab94-d2428bade61e-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.781486 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"96fb4b3b0684eec0f8e815c984345d77640459634c9d28cbf8434505ebf34891"}
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.783743 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs9nq" event={"ID":"dd20d05f-cd0f-401e-b18a-2f89354792d0","Type":"ContainerDied","Data":"3a14ec6fcf7e574cbb7bb1e550a27abeaf3193fe3131800ddd76cb089990f9d3"}
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.783780 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs9nq"
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.783781 4886 scope.go:117] "RemoveContainer" containerID="3aec1abede58b8faa82b73ab79ff75672caa26cb287c28081010173343956dcc"
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.786506 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jbxl" event={"ID":"a710476e-74f4-4f7e-ab94-d2428bade61e","Type":"ContainerDied","Data":"1b7c7ac95d6deb14d58d68d8614d14207966e7b0c294b7297faa9446ddd99953"}
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.786617 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4jbxl"
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.809241 4886 scope.go:117] "RemoveContainer" containerID="2838a0bdd722f9e7f7de971f3ef56f281b5be560ab82b4ce2dc92224cbf0042f"
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.813409 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a710476e-74f4-4f7e-ab94-d2428bade61e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a710476e-74f4-4f7e-ab94-d2428bade61e" (UID: "a710476e-74f4-4f7e-ab94-d2428bade61e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.814766 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs9nq"]
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.818702 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs9nq"]
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.826713 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a710476e-74f4-4f7e-ab94-d2428bade61e-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.840021 4886 scope.go:117] "RemoveContainer" containerID="993aeae10b51b9ba867b7ad588cb7c6e7651b0c3345b073059af7a58ad9790c3"
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.864902 4886 scope.go:117] "RemoveContainer" containerID="b22791e3d9f615101442f2f7febeb8dc3309e984e4f279202303392053825edf"
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.932745 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mpttg"]
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.938906 4886 scope.go:117] "RemoveContainer" containerID="f05d2d4560320194303a8b36647eaf5baeadb47a7241ddfee9698d44fa4aaa4c"
Jan 29 16:26:04 crc kubenswrapper[4886]: I0129 16:26:04.959636 4886 scope.go:117] "RemoveContainer" containerID="542d74b470422150123685d3edf24455da6a5470e04d40768b0ed7b1e8d27bc4"
Jan 29 16:26:05 crc kubenswrapper[4886]: I0129 16:26:05.114645 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4jbxl"]
Jan 29 16:26:05 crc kubenswrapper[4886]: I0129 16:26:05.114690 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjqm7"]
Jan 29 16:26:05 crc kubenswrapper[4886]: I0129 16:26:05.114895 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qjqm7" podUID="057806c7-b5ca-43df-91c7-30a2dc58c011" containerName="registry-server" containerID="cri-o://b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb" gracePeriod=2
Jan 29 16:26:05 crc kubenswrapper[4886]: I0129 16:26:05.124534 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4jbxl"]
Jan 29 16:26:05 crc kubenswrapper[4886]: I0129 16:26:05.887483 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6hph6"
Jan 29 16:26:05 crc kubenswrapper[4886]: I0129 16:26:05.930982 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6hph6"
Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.620960 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a710476e-74f4-4f7e-ab94-d2428bade61e" path="/var/lib/kubelet/pods/a710476e-74f4-4f7e-ab94-d2428bade61e/volumes"
Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.632110 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd20d05f-cd0f-401e-b18a-2f89354792d0" path="/var/lib/kubelet/pods/dd20d05f-cd0f-401e-b18a-2f89354792d0/volumes"
Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.780284 4886 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-qjqm7" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.811341 4886 generic.go:334] "Generic (PLEG): container finished" podID="057806c7-b5ca-43df-91c7-30a2dc58c011" containerID="b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb" exitCode=0 Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.811409 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjqm7" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.811430 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjqm7" event={"ID":"057806c7-b5ca-43df-91c7-30a2dc58c011","Type":"ContainerDied","Data":"b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb"} Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.811484 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjqm7" event={"ID":"057806c7-b5ca-43df-91c7-30a2dc58c011","Type":"ContainerDied","Data":"b80b8058bdb8fd4eef83ffeccee0a93733e929325e740b25b1e55fdba478cf66"} Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.811504 4886 scope.go:117] "RemoveContainer" containerID="b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.829208 4886 scope.go:117] "RemoveContainer" containerID="307811e0c4081bf12c363b76eff5629bd7ac5901479db6027a6bd50e6cae2ccc" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.843804 4886 scope.go:117] "RemoveContainer" containerID="a2a6cbc6c2cee221b3e74aba38fce6c75da0d8e08f7766fa4a0eb1f485c41312" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.851313 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs29d\" (UniqueName: \"kubernetes.io/projected/057806c7-b5ca-43df-91c7-30a2dc58c011-kube-api-access-gs29d\") pod \"057806c7-b5ca-43df-91c7-30a2dc58c011\" (UID: \"057806c7-b5ca-43df-91c7-30a2dc58c011\") " Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.851424 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057806c7-b5ca-43df-91c7-30a2dc58c011-catalog-content\") pod \"057806c7-b5ca-43df-91c7-30a2dc58c011\" (UID: \"057806c7-b5ca-43df-91c7-30a2dc58c011\") " Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.851513 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057806c7-b5ca-43df-91c7-30a2dc58c011-utilities\") pod \"057806c7-b5ca-43df-91c7-30a2dc58c011\" (UID: \"057806c7-b5ca-43df-91c7-30a2dc58c011\") " Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.853119 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/057806c7-b5ca-43df-91c7-30a2dc58c011-utilities" (OuterVolumeSpecName: "utilities") pod "057806c7-b5ca-43df-91c7-30a2dc58c011" (UID: "057806c7-b5ca-43df-91c7-30a2dc58c011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.858277 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/057806c7-b5ca-43df-91c7-30a2dc58c011-kube-api-access-gs29d" (OuterVolumeSpecName: "kube-api-access-gs29d") pod "057806c7-b5ca-43df-91c7-30a2dc58c011" (UID: "057806c7-b5ca-43df-91c7-30a2dc58c011"). InnerVolumeSpecName "kube-api-access-gs29d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.863599 4886 scope.go:117] "RemoveContainer" containerID="b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb" Jan 29 16:26:06 crc kubenswrapper[4886]: E0129 16:26:06.864126 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb\": container with ID starting with b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb not found: ID does not exist" containerID="b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.864162 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb"} err="failed to get container status \"b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb\": rpc error: code = NotFound desc = could not find container \"b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb\": container with ID starting with b7cd9c63904e404fe9446a1ff9402be281118c2ffb2023c64847b10d15f887eb not found: ID does not exist" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.864188 4886 scope.go:117] "RemoveContainer" containerID="307811e0c4081bf12c363b76eff5629bd7ac5901479db6027a6bd50e6cae2ccc" Jan 29 16:26:06 crc kubenswrapper[4886]: E0129 16:26:06.864548 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"307811e0c4081bf12c363b76eff5629bd7ac5901479db6027a6bd50e6cae2ccc\": container with ID starting with 307811e0c4081bf12c363b76eff5629bd7ac5901479db6027a6bd50e6cae2ccc not found: ID does not exist" containerID="307811e0c4081bf12c363b76eff5629bd7ac5901479db6027a6bd50e6cae2ccc" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.864573 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"307811e0c4081bf12c363b76eff5629bd7ac5901479db6027a6bd50e6cae2ccc"} err="failed to get container status \"307811e0c4081bf12c363b76eff5629bd7ac5901479db6027a6bd50e6cae2ccc\": rpc error: code = NotFound desc = could not find container \"307811e0c4081bf12c363b76eff5629bd7ac5901479db6027a6bd50e6cae2ccc\": container with ID starting with 307811e0c4081bf12c363b76eff5629bd7ac5901479db6027a6bd50e6cae2ccc not found: ID does not exist" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.864586 4886 scope.go:117] "RemoveContainer" containerID="a2a6cbc6c2cee221b3e74aba38fce6c75da0d8e08f7766fa4a0eb1f485c41312" Jan 29 16:26:06 crc kubenswrapper[4886]: E0129 16:26:06.864844 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a6cbc6c2cee221b3e74aba38fce6c75da0d8e08f7766fa4a0eb1f485c41312\": container with ID starting with a2a6cbc6c2cee221b3e74aba38fce6c75da0d8e08f7766fa4a0eb1f485c41312 not found: ID does not 
exist" containerID="a2a6cbc6c2cee221b3e74aba38fce6c75da0d8e08f7766fa4a0eb1f485c41312" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.864879 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a6cbc6c2cee221b3e74aba38fce6c75da0d8e08f7766fa4a0eb1f485c41312"} err="failed to get container status \"a2a6cbc6c2cee221b3e74aba38fce6c75da0d8e08f7766fa4a0eb1f485c41312\": rpc error: code = NotFound desc = could not find container \"a2a6cbc6c2cee221b3e74aba38fce6c75da0d8e08f7766fa4a0eb1f485c41312\": container with ID starting with a2a6cbc6c2cee221b3e74aba38fce6c75da0d8e08f7766fa4a0eb1f485c41312 not found: ID does not exist" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.902031 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/057806c7-b5ca-43df-91c7-30a2dc58c011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "057806c7-b5ca-43df-91c7-30a2dc58c011" (UID: "057806c7-b5ca-43df-91c7-30a2dc58c011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.952733 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057806c7-b5ca-43df-91c7-30a2dc58c011-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.952776 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs29d\" (UniqueName: \"kubernetes.io/projected/057806c7-b5ca-43df-91c7-30a2dc58c011-kube-api-access-gs29d\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:06 crc kubenswrapper[4886]: I0129 16:26:06.952789 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057806c7-b5ca-43df-91c7-30a2dc58c011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:07 crc kubenswrapper[4886]: I0129 16:26:07.144630 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjqm7"] Jan 29 16:26:07 crc kubenswrapper[4886]: I0129 16:26:07.148262 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qjqm7"] Jan 29 16:26:08 crc kubenswrapper[4886]: I0129 16:26:08.625747 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="057806c7-b5ca-43df-91c7-30a2dc58c011" path="/var/lib/kubelet/pods/057806c7-b5ca-43df-91c7-30a2dc58c011/volumes" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.884194 4886 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.884944 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a50cf2f-b08d-4f5c-a364-d939d83aa205" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.884957 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a50cf2f-b08d-4f5c-a364-d939d83aa205" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.884966 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd20d05f-cd0f-401e-b18a-2f89354792d0" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.884972 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd20d05f-cd0f-401e-b18a-2f89354792d0" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 
16:26:11.884981 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd20d05f-cd0f-401e-b18a-2f89354792d0" containerName="extract-utilities" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.884987 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd20d05f-cd0f-401e-b18a-2f89354792d0" containerName="extract-utilities" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.884996 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd20d05f-cd0f-401e-b18a-2f89354792d0" containerName="extract-content" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885004 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd20d05f-cd0f-401e-b18a-2f89354792d0" containerName="extract-content" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.885014 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057806c7-b5ca-43df-91c7-30a2dc58c011" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885020 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="057806c7-b5ca-43df-91c7-30a2dc58c011" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.885030 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a710476e-74f4-4f7e-ab94-d2428bade61e" containerName="extract-content" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885035 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a710476e-74f4-4f7e-ab94-d2428bade61e" containerName="extract-content" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.885045 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a710476e-74f4-4f7e-ab94-d2428bade61e" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885050 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a710476e-74f4-4f7e-ab94-d2428bade61e" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.885058 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a50cf2f-b08d-4f5c-a364-d939d83aa205" containerName="extract-utilities" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885064 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a50cf2f-b08d-4f5c-a364-d939d83aa205" containerName="extract-utilities" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.885071 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057806c7-b5ca-43df-91c7-30a2dc58c011" containerName="extract-utilities" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885077 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="057806c7-b5ca-43df-91c7-30a2dc58c011" containerName="extract-utilities" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.885088 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057806c7-b5ca-43df-91c7-30a2dc58c011" containerName="extract-content" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885094 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="057806c7-b5ca-43df-91c7-30a2dc58c011" containerName="extract-content" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.885103 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c03df6-46f4-4ad6-b8ea-7753cceb381c" containerName="pruner" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885108 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c03df6-46f4-4ad6-b8ea-7753cceb381c" containerName="pruner" Jan 29 16:26:11 crc 
kubenswrapper[4886]: E0129 16:26:11.885115 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a710476e-74f4-4f7e-ab94-d2428bade61e" containerName="extract-utilities" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885120 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a710476e-74f4-4f7e-ab94-d2428bade61e" containerName="extract-utilities" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.885127 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a50cf2f-b08d-4f5c-a364-d939d83aa205" containerName="extract-content" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885133 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a50cf2f-b08d-4f5c-a364-d939d83aa205" containerName="extract-content" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885225 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c03df6-46f4-4ad6-b8ea-7753cceb381c" containerName="pruner" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885237 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd20d05f-cd0f-401e-b18a-2f89354792d0" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885243 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="057806c7-b5ca-43df-91c7-30a2dc58c011" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885253 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a710476e-74f4-4f7e-ab94-d2428bade61e" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885261 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a50cf2f-b08d-4f5c-a364-d939d83aa205" containerName="registry-server" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885557 4886 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885774 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.885918 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b9f3a2de52a936816a5d1e98920861b324b9980bf8a60336caab039ebbd563cc" gracePeriod=15 Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.886005 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2d2126e0e150d4a578976def8715d596ae31d0561b0eaa832061d4fb86a8a930" gracePeriod=15 Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.886058 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://2aaea10d8ea0e36361380eb0c535a3fdc5b51d62e499adcbc5d57558b58e8749" gracePeriod=15 Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.886080 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ad6238fc03a0e7aa722791bda44bbaeca8a7269580529a4dd5d62cf0d1e39981" gracePeriod=15 Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.886387 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://40c80ff4d9a5e63764163d3748d2ade63000eb35bda512cf37a51c9f8b805fff" gracePeriod=15 Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888107 4886 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.888254 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888265 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.888275 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888281 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.888288 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888293 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.888300 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888306 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.888323 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888349 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.888358 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888364 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.888376 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888382 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888462 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888471 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888480 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888487 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888496 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888507 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888514 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 29 16:26:11 crc kubenswrapper[4886]: E0129 16:26:11.888593 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.888599 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.911526 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.911604 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.911694 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.912242 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.912397 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.912457 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.913293 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.913372 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:11 crc kubenswrapper[4886]: I0129 16:26:11.921500 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014217 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014529 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014570 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014589 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014578 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014621 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014636 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014669 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014669 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014692 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014710 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014746 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014753 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014780 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014789 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.014817 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.219867 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:26:12 crc kubenswrapper[4886]: W0129 16:26:12.241269 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-0386712ddaeec2f4509b379ed96a080a64a88a43a08f6f7600c59c97f88bb567 WatchSource:0}: Error finding container 0386712ddaeec2f4509b379ed96a080a64a88a43a08f6f7600c59c97f88bb567: Status 404 returned error can't find the container with id 0386712ddaeec2f4509b379ed96a080a64a88a43a08f6f7600c59c97f88bb567 Jan 29 16:26:12 crc kubenswrapper[4886]: E0129 16:26:12.245001 4886 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.174:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f4062ef2fd167 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-29 16:26:12.243755367 +0000 UTC m=+255.152474659,LastTimestamp:2026-01-29 16:26:12.243755367 +0000 UTC m=+255.152474659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 29 16:26:12 crc kubenswrapper[4886]: E0129 16:26:12.439213 4886 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.174:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f4062ef2fd167 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-29 16:26:12.243755367 +0000 UTC m=+255.152474659,LastTimestamp:2026-01-29 16:26:12.243755367 +0000 UTC m=+255.152474659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.843595 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e338e481af24aecd5ce5485aecf3d5729c1fbb23b68efbbc211fd833fc6aa1fa"} Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.843659 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0386712ddaeec2f4509b379ed96a080a64a88a43a08f6f7600c59c97f88bb567"} Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.844303 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.845080 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.846064 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.846624 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2d2126e0e150d4a578976def8715d596ae31d0561b0eaa832061d4fb86a8a930" exitCode=0 Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.846643 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="40c80ff4d9a5e63764163d3748d2ade63000eb35bda512cf37a51c9f8b805fff" exitCode=0 Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.846653 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2aaea10d8ea0e36361380eb0c535a3fdc5b51d62e499adcbc5d57558b58e8749" exitCode=0 Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.846661 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ad6238fc03a0e7aa722791bda44bbaeca8a7269580529a4dd5d62cf0d1e39981" exitCode=2 Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.846710 4886 scope.go:117] "RemoveContainer" containerID="8bbfe403372c663d59079e8c4111846693950b0eca93a07be737c20775395f88" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.848203 4886 generic.go:334] "Generic (PLEG): container finished" podID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" containerID="c343d7cf431e697a16a8317ad5a319272ba2d6db4aeee174cb506961f6519cb9" exitCode=0 Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.848244 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9027a6d8-0cac-4276-b722-08c3a99c6cf9","Type":"ContainerDied","Data":"c343d7cf431e697a16a8317ad5a319272ba2d6db4aeee174cb506961f6519cb9"} Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.848835 4886 status_manager.go:851] "Failed to get status for pod" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:12 crc kubenswrapper[4886]: I0129 16:26:12.849223 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:13 crc kubenswrapper[4886]: I0129 16:26:13.855976 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.157989 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.158668 4886 status_manager.go:851] "Failed to get status for pod" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.158929 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.257154 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.258077 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.258746 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.259212 4886 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.259632 4886 status_manager.go:851] "Failed to get status for pod" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.341561 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9027a6d8-0cac-4276-b722-08c3a99c6cf9-kubelet-dir\") pod \"9027a6d8-0cac-4276-b722-08c3a99c6cf9\" (UID: \"9027a6d8-0cac-4276-b722-08c3a99c6cf9\") " Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.341640 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/9027a6d8-0cac-4276-b722-08c3a99c6cf9-kube-api-access\") pod \"9027a6d8-0cac-4276-b722-08c3a99c6cf9\" (UID: \"9027a6d8-0cac-4276-b722-08c3a99c6cf9\") " Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.341691 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9027a6d8-0cac-4276-b722-08c3a99c6cf9-var-lock\") pod \"9027a6d8-0cac-4276-b722-08c3a99c6cf9\" (UID: \"9027a6d8-0cac-4276-b722-08c3a99c6cf9\") " Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.341706 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9027a6d8-0cac-4276-b722-08c3a99c6cf9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9027a6d8-0cac-4276-b722-08c3a99c6cf9" (UID: "9027a6d8-0cac-4276-b722-08c3a99c6cf9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.341848 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9027a6d8-0cac-4276-b722-08c3a99c6cf9-var-lock" (OuterVolumeSpecName: "var-lock") pod "9027a6d8-0cac-4276-b722-08c3a99c6cf9" (UID: "9027a6d8-0cac-4276-b722-08c3a99c6cf9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.342080 4886 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9027a6d8-0cac-4276-b722-08c3a99c6cf9-var-lock\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.342103 4886 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9027a6d8-0cac-4276-b722-08c3a99c6cf9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.347488 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9027a6d8-0cac-4276-b722-08c3a99c6cf9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9027a6d8-0cac-4276-b722-08c3a99c6cf9" (UID: "9027a6d8-0cac-4276-b722-08c3a99c6cf9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.443569 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.443774 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.443818 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.443867 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.443872 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.443971 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.444146 4886 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.444166 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9027a6d8-0cac-4276-b722-08c3a99c6cf9-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.444178 4886 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.444192 4886 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.622580 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.865934 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.867279 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b9f3a2de52a936816a5d1e98920861b324b9980bf8a60336caab039ebbd563cc" exitCode=0 Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.867378 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.867380 4886 scope.go:117] "RemoveContainer" containerID="2d2126e0e150d4a578976def8715d596ae31d0561b0eaa832061d4fb86a8a930" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.868045 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.868258 4886 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.868494 4886 status_manager.go:851] "Failed to get status for pod" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.868674 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9027a6d8-0cac-4276-b722-08c3a99c6cf9","Type":"ContainerDied","Data":"c78b07716ffb8a4c7dfa38504f62f4211f74dab5deb70928233e82d0c002e686"} Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.868705 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.868696 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c78b07716ffb8a4c7dfa38504f62f4211f74dab5deb70928233e82d0c002e686" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.871082 4886 status_manager.go:851] "Failed to get status for pod" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.871308 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.871739 4886 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.873082 4886 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.873377 4886 status_manager.go:851] "Failed to get status for pod" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.873686 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.881620 4886 scope.go:117] "RemoveContainer" containerID="40c80ff4d9a5e63764163d3748d2ade63000eb35bda512cf37a51c9f8b805fff" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.897750 4886 scope.go:117] "RemoveContainer" containerID="2aaea10d8ea0e36361380eb0c535a3fdc5b51d62e499adcbc5d57558b58e8749" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.912553 4886 scope.go:117] "RemoveContainer" containerID="ad6238fc03a0e7aa722791bda44bbaeca8a7269580529a4dd5d62cf0d1e39981" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.926167 4886 scope.go:117] "RemoveContainer" containerID="b9f3a2de52a936816a5d1e98920861b324b9980bf8a60336caab039ebbd563cc" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.940880 4886 scope.go:117] "RemoveContainer" 
containerID="92a0f5389357492bf461db75ffb1ced7fa106c160b16e7e701f99f90a0c8fb08" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.957314 4886 scope.go:117] "RemoveContainer" containerID="2d2126e0e150d4a578976def8715d596ae31d0561b0eaa832061d4fb86a8a930" Jan 29 16:26:14 crc kubenswrapper[4886]: E0129 16:26:14.957760 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d2126e0e150d4a578976def8715d596ae31d0561b0eaa832061d4fb86a8a930\": container with ID starting with 2d2126e0e150d4a578976def8715d596ae31d0561b0eaa832061d4fb86a8a930 not found: ID does not exist" containerID="2d2126e0e150d4a578976def8715d596ae31d0561b0eaa832061d4fb86a8a930" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.957901 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d2126e0e150d4a578976def8715d596ae31d0561b0eaa832061d4fb86a8a930"} err="failed to get container status \"2d2126e0e150d4a578976def8715d596ae31d0561b0eaa832061d4fb86a8a930\": rpc error: code = NotFound desc = could not find container \"2d2126e0e150d4a578976def8715d596ae31d0561b0eaa832061d4fb86a8a930\": container with ID starting with 2d2126e0e150d4a578976def8715d596ae31d0561b0eaa832061d4fb86a8a930 not found: ID does not exist" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.958008 4886 scope.go:117] "RemoveContainer" containerID="40c80ff4d9a5e63764163d3748d2ade63000eb35bda512cf37a51c9f8b805fff" Jan 29 16:26:14 crc kubenswrapper[4886]: E0129 16:26:14.958522 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40c80ff4d9a5e63764163d3748d2ade63000eb35bda512cf37a51c9f8b805fff\": container with ID starting with 40c80ff4d9a5e63764163d3748d2ade63000eb35bda512cf37a51c9f8b805fff not found: ID does not exist" containerID="40c80ff4d9a5e63764163d3748d2ade63000eb35bda512cf37a51c9f8b805fff" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.958557 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c80ff4d9a5e63764163d3748d2ade63000eb35bda512cf37a51c9f8b805fff"} err="failed to get container status \"40c80ff4d9a5e63764163d3748d2ade63000eb35bda512cf37a51c9f8b805fff\": rpc error: code = NotFound desc = could not find container \"40c80ff4d9a5e63764163d3748d2ade63000eb35bda512cf37a51c9f8b805fff\": container with ID starting with 40c80ff4d9a5e63764163d3748d2ade63000eb35bda512cf37a51c9f8b805fff not found: ID does not exist" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.958583 4886 scope.go:117] "RemoveContainer" containerID="2aaea10d8ea0e36361380eb0c535a3fdc5b51d62e499adcbc5d57558b58e8749" Jan 29 16:26:14 crc kubenswrapper[4886]: E0129 16:26:14.958868 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aaea10d8ea0e36361380eb0c535a3fdc5b51d62e499adcbc5d57558b58e8749\": container with ID starting with 2aaea10d8ea0e36361380eb0c535a3fdc5b51d62e499adcbc5d57558b58e8749 not found: ID does not exist" containerID="2aaea10d8ea0e36361380eb0c535a3fdc5b51d62e499adcbc5d57558b58e8749" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.958891 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aaea10d8ea0e36361380eb0c535a3fdc5b51d62e499adcbc5d57558b58e8749"} err="failed to get container status \"2aaea10d8ea0e36361380eb0c535a3fdc5b51d62e499adcbc5d57558b58e8749\": rpc error: code = 
NotFound desc = could not find container \"2aaea10d8ea0e36361380eb0c535a3fdc5b51d62e499adcbc5d57558b58e8749\": container with ID starting with 2aaea10d8ea0e36361380eb0c535a3fdc5b51d62e499adcbc5d57558b58e8749 not found: ID does not exist" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.958927 4886 scope.go:117] "RemoveContainer" containerID="ad6238fc03a0e7aa722791bda44bbaeca8a7269580529a4dd5d62cf0d1e39981" Jan 29 16:26:14 crc kubenswrapper[4886]: E0129 16:26:14.959301 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad6238fc03a0e7aa722791bda44bbaeca8a7269580529a4dd5d62cf0d1e39981\": container with ID starting with ad6238fc03a0e7aa722791bda44bbaeca8a7269580529a4dd5d62cf0d1e39981 not found: ID does not exist" containerID="ad6238fc03a0e7aa722791bda44bbaeca8a7269580529a4dd5d62cf0d1e39981" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.959415 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6238fc03a0e7aa722791bda44bbaeca8a7269580529a4dd5d62cf0d1e39981"} err="failed to get container status \"ad6238fc03a0e7aa722791bda44bbaeca8a7269580529a4dd5d62cf0d1e39981\": rpc error: code = NotFound desc = could not find container \"ad6238fc03a0e7aa722791bda44bbaeca8a7269580529a4dd5d62cf0d1e39981\": container with ID starting with ad6238fc03a0e7aa722791bda44bbaeca8a7269580529a4dd5d62cf0d1e39981 not found: ID does not exist" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.959500 4886 scope.go:117] "RemoveContainer" containerID="b9f3a2de52a936816a5d1e98920861b324b9980bf8a60336caab039ebbd563cc" Jan 29 16:26:14 crc kubenswrapper[4886]: E0129 16:26:14.959843 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f3a2de52a936816a5d1e98920861b324b9980bf8a60336caab039ebbd563cc\": container with ID starting with b9f3a2de52a936816a5d1e98920861b324b9980bf8a60336caab039ebbd563cc not found: ID does not exist" containerID="b9f3a2de52a936816a5d1e98920861b324b9980bf8a60336caab039ebbd563cc" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.959878 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f3a2de52a936816a5d1e98920861b324b9980bf8a60336caab039ebbd563cc"} err="failed to get container status \"b9f3a2de52a936816a5d1e98920861b324b9980bf8a60336caab039ebbd563cc\": rpc error: code = NotFound desc = could not find container \"b9f3a2de52a936816a5d1e98920861b324b9980bf8a60336caab039ebbd563cc\": container with ID starting with b9f3a2de52a936816a5d1e98920861b324b9980bf8a60336caab039ebbd563cc not found: ID does not exist" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.959899 4886 scope.go:117] "RemoveContainer" containerID="92a0f5389357492bf461db75ffb1ced7fa106c160b16e7e701f99f90a0c8fb08" Jan 29 16:26:14 crc kubenswrapper[4886]: E0129 16:26:14.960149 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a0f5389357492bf461db75ffb1ced7fa106c160b16e7e701f99f90a0c8fb08\": container with ID starting with 92a0f5389357492bf461db75ffb1ced7fa106c160b16e7e701f99f90a0c8fb08 not found: ID does not exist" containerID="92a0f5389357492bf461db75ffb1ced7fa106c160b16e7e701f99f90a0c8fb08" Jan 29 16:26:14 crc kubenswrapper[4886]: I0129 16:26:14.960174 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"92a0f5389357492bf461db75ffb1ced7fa106c160b16e7e701f99f90a0c8fb08"} err="failed to get container status \"92a0f5389357492bf461db75ffb1ced7fa106c160b16e7e701f99f90a0c8fb08\": rpc error: code = NotFound desc = could not find container \"92a0f5389357492bf461db75ffb1ced7fa106c160b16e7e701f99f90a0c8fb08\": container with ID starting with 92a0f5389357492bf461db75ffb1ced7fa106c160b16e7e701f99f90a0c8fb08 not found: ID does not exist" Jan 29 16:26:15 crc kubenswrapper[4886]: E0129 16:26:15.205023 4886 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:15 crc kubenswrapper[4886]: E0129 16:26:15.205770 4886 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:15 crc kubenswrapper[4886]: E0129 16:26:15.206271 4886 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:15 crc kubenswrapper[4886]: E0129 16:26:15.206716 4886 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:15 crc kubenswrapper[4886]: E0129 16:26:15.207097 4886 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:15 crc kubenswrapper[4886]: I0129 16:26:15.207140 4886 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 29 16:26:15 crc kubenswrapper[4886]: E0129 16:26:15.207470 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.174:6443: connect: connection refused" interval="200ms" Jan 29 16:26:15 crc kubenswrapper[4886]: E0129 16:26:15.408035 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.174:6443: connect: connection refused" interval="400ms" Jan 29 16:26:15 crc kubenswrapper[4886]: E0129 16:26:15.809718 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.174:6443: connect: connection refused" interval="800ms" Jan 29 16:26:16 crc kubenswrapper[4886]: E0129 16:26:16.610158 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.174:6443: connect: connection refused" interval="1.6s" Jan 29 16:26:18 crc kubenswrapper[4886]: E0129 16:26:18.211106 4886 controller.go:145] "Failed 
to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.174:6443: connect: connection refused" interval="3.2s" Jan 29 16:26:18 crc kubenswrapper[4886]: I0129 16:26:18.619528 4886 status_manager.go:851] "Failed to get status for pod" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:18 crc kubenswrapper[4886]: I0129 16:26:18.620853 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:21 crc kubenswrapper[4886]: E0129 16:26:21.412037 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.174:6443: connect: connection refused" interval="6.4s" Jan 29 16:26:22 crc kubenswrapper[4886]: E0129 16:26:22.440099 4886 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.174:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f4062ef2fd167 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-29 16:26:12.243755367 +0000 UTC m=+255.152474659,LastTimestamp:2026-01-29 16:26:12.243755367 +0000 UTC m=+255.152474659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.614294 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.615628 4886 status_manager.go:851] "Failed to get status for pod" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.616381 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.634989 4886 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.635062 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:25 crc kubenswrapper[4886]: E0129 16:26:25.635913 4886 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.636471 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.933471 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"576bc78e4b76f0ff28a3f03c5d234ce586e9d3fb6eb00dbb7c575ad0144179c4"} Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.937047 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.937109 4886 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a370948657cae25c181170bc42e45d896e01469cb4079ad6ed412210527edb08" exitCode=1 Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.937145 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a370948657cae25c181170bc42e45d896e01469cb4079ad6ed412210527edb08"} Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.937710 4886 scope.go:117] "RemoveContainer" containerID="a370948657cae25c181170bc42e45d896e01469cb4079ad6ed412210527edb08" Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.937974 4886 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.174:6443: 
connect: connection refused" Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.938511 4886 status_manager.go:851] "Failed to get status for pod" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:25 crc kubenswrapper[4886]: I0129 16:26:25.939017 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.680530 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.947661 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.948130 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4c54ca3c104e6bbe0325be1c3777b09d70215a073d7aa15018d297a353e4dbc6"} Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.949235 4886 status_manager.go:851] "Failed to get status for pod" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.949991 4886 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.950534 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.950698 4886 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="8dad781d4af802765bf506aa6cadb462999deeecf1dcbd5cb3f76ab9caeebeb9" exitCode=0 Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.950754 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"8dad781d4af802765bf506aa6cadb462999deeecf1dcbd5cb3f76ab9caeebeb9"} Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.951099 4886 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.951135 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:26 crc kubenswrapper[4886]: E0129 16:26:26.951587 4886 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.951638 4886 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.952220 4886 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:26 crc kubenswrapper[4886]: I0129 16:26:26.952705 4886 status_manager.go:851] "Failed to get status for pod" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.174:6443: connect: connection refused" Jan 29 16:26:27 crc kubenswrapper[4886]: I0129 16:26:27.958864 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4287cf42783b6750073994717fd7d568e50f9da2a07db5b726e2f78e4c469e77"} Jan 29 16:26:27 crc kubenswrapper[4886]: I0129 16:26:27.958900 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"96b66c9fbf5375b57b3c97dec37f824e12eac08fed2f97956b22ec7cc45c44f4"} Jan 29 16:26:27 crc kubenswrapper[4886]: I0129 16:26:27.958910 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9850f529a11e352c6246aa0a71bfec5294fdf9c2bc6c8a9fe2aa9af6f6a37ee7"} Jan 29 16:26:28 crc kubenswrapper[4886]: I0129 16:26:28.966757 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b33b62cf638211f8d3d3f038f7e733b3cfe70aa3fa225f193239f1d4b3b96041"} Jan 29 16:26:28 crc kubenswrapper[4886]: I0129 16:26:28.966810 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9b93df2811c25caa29409580b2d942f36f17760a1726f973735a28c47c2a43b8"} Jan 29 16:26:28 crc kubenswrapper[4886]: I0129 16:26:28.966962 4886 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:28 crc kubenswrapper[4886]: I0129 16:26:28.967040 4886 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:28 crc kubenswrapper[4886]: I0129 16:26:28.967065 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:29 crc kubenswrapper[4886]: I0129 16:26:29.957656 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mpttg" podUID="b947565b-6a14-4bbd-881e-e82c33ca3a3b" containerName="oauth-openshift" containerID="cri-o://8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a" gracePeriod=15 Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.297018 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mpttg" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.446888 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-serving-cert\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.446959 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-service-ca\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447000 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b947565b-6a14-4bbd-881e-e82c33ca3a3b-audit-dir\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447069 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-idp-0-file-data\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447094 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-cliconfig\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447130 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-ocp-branding-template\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447170 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-session\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447197 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-template-provider-selection\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447234 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-router-certs\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447274 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-template-error\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447301 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-trusted-ca-bundle\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447350 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-audit-policies\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447381 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-template-login\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447419 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqjmr\" (UniqueName: \"kubernetes.io/projected/b947565b-6a14-4bbd-881e-e82c33ca3a3b-kube-api-access-hqjmr\") pod \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\" (UID: \"b947565b-6a14-4bbd-881e-e82c33ca3a3b\") " Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.447172 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b947565b-6a14-4bbd-881e-e82c33ca3a3b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.448999 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.449047 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.449469 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.450551 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.455506 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.456015 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b947565b-6a14-4bbd-881e-e82c33ca3a3b-kube-api-access-hqjmr" (OuterVolumeSpecName: "kube-api-access-hqjmr") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "kube-api-access-hqjmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.457403 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.457565 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.458383 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.458897 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.459190 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.459509 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.461224 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b947565b-6a14-4bbd-881e-e82c33ca3a3b" (UID: "b947565b-6a14-4bbd-881e-e82c33ca3a3b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548621 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548688 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548718 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548737 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548758 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548784 4886 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548806 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548826 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqjmr\" (UniqueName: \"kubernetes.io/projected/b947565b-6a14-4bbd-881e-e82c33ca3a3b-kube-api-access-hqjmr\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548844 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548857 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548873 4886 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b947565b-6a14-4bbd-881e-e82c33ca3a3b-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548893 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-user-idp-0-file-data\") on 
node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548911 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.548929 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b947565b-6a14-4bbd-881e-e82c33ca3a3b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.636921 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.637003 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.652097 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.994609 4886 generic.go:334] "Generic (PLEG): container finished" podID="b947565b-6a14-4bbd-881e-e82c33ca3a3b" containerID="8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a" exitCode=0 Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.995249 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mpttg" event={"ID":"b947565b-6a14-4bbd-881e-e82c33ca3a3b","Type":"ContainerDied","Data":"8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a"} Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.995623 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mpttg" event={"ID":"b947565b-6a14-4bbd-881e-e82c33ca3a3b","Type":"ContainerDied","Data":"cb33ac24972d3d5dba165317920577129d54d60d3420d9aec798c5982a6dac0a"} Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.995834 4886 scope.go:117] "RemoveContainer" containerID="8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a" Jan 29 16:26:30 crc kubenswrapper[4886]: I0129 16:26:30.996322 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mpttg" Jan 29 16:26:31 crc kubenswrapper[4886]: I0129 16:26:31.023551 4886 scope.go:117] "RemoveContainer" containerID="8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a" Jan 29 16:26:31 crc kubenswrapper[4886]: E0129 16:26:31.024072 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a\": container with ID starting with 8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a not found: ID does not exist" containerID="8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a" Jan 29 16:26:31 crc kubenswrapper[4886]: I0129 16:26:31.024167 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a"} err="failed to get container status \"8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a\": rpc error: code = NotFound desc = could not find container \"8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a\": container with ID starting with 8bc0819e4d3779242ef0e41d51afff359c9061460b45623abee6c85c9020ca9a not found: ID does not exist" Jan 29 16:26:33 crc kubenswrapper[4886]: I0129 16:26:33.077673 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 16:26:33 crc kubenswrapper[4886]: I0129 16:26:33.077779 4886 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 29 16:26:33 crc kubenswrapper[4886]: I0129 16:26:33.078644 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 29 16:26:33 crc kubenswrapper[4886]: I0129 16:26:33.978373 4886 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:34 crc kubenswrapper[4886]: I0129 16:26:34.015112 4886 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:34 crc kubenswrapper[4886]: I0129 16:26:34.015148 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:34 crc kubenswrapper[4886]: I0129 16:26:34.018655 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:34 crc kubenswrapper[4886]: I0129 16:26:34.021156 4886 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e5a72ce7-a4db-4f3d-ba76-57bd63d6dba2" Jan 29 16:26:35 crc kubenswrapper[4886]: I0129 16:26:35.020385 4886 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:35 crc kubenswrapper[4886]: I0129 16:26:35.020412 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:36 crc kubenswrapper[4886]: I0129 16:26:36.680493 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 16:26:38 crc kubenswrapper[4886]: I0129 16:26:38.643503 4886 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e5a72ce7-a4db-4f3d-ba76-57bd63d6dba2" Jan 29 16:26:43 crc kubenswrapper[4886]: I0129 16:26:43.019819 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 29 16:26:43 crc kubenswrapper[4886]: I0129 16:26:43.078676 4886 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 29 16:26:43 crc kubenswrapper[4886]: I0129 16:26:43.078743 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 29 16:26:43 crc kubenswrapper[4886]: I0129 16:26:43.812541 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 29 16:26:44 crc kubenswrapper[4886]: I0129 16:26:44.098373 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 29 16:26:44 crc kubenswrapper[4886]: I0129 16:26:44.453658 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 29 16:26:44 crc kubenswrapper[4886]: I0129 16:26:44.468015 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 29 16:26:44 crc kubenswrapper[4886]: I0129 16:26:44.535866 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 29 16:26:44 crc kubenswrapper[4886]: I0129 16:26:44.890707 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 29 16:26:44 crc kubenswrapper[4886]: I0129 16:26:44.906868 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 29 16:26:45 crc kubenswrapper[4886]: I0129 16:26:45.005390 4886 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 29 16:26:45 crc kubenswrapper[4886]: I0129 16:26:45.097250 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 29 16:26:45 crc kubenswrapper[4886]: I0129 16:26:45.251465 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 29 
16:26:45 crc kubenswrapper[4886]: I0129 16:26:45.653704 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 29 16:26:45 crc kubenswrapper[4886]: I0129 16:26:45.700260 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 29 16:26:45 crc kubenswrapper[4886]: I0129 16:26:45.765113 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 29 16:26:45 crc kubenswrapper[4886]: I0129 16:26:45.861661 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 29 16:26:46 crc kubenswrapper[4886]: I0129 16:26:46.338378 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 29 16:26:46 crc kubenswrapper[4886]: I0129 16:26:46.480753 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 29 16:26:46 crc kubenswrapper[4886]: I0129 16:26:46.647143 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 29 16:26:46 crc kubenswrapper[4886]: I0129 16:26:46.666892 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 29 16:26:46 crc kubenswrapper[4886]: I0129 16:26:46.705681 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 29 16:26:46 crc kubenswrapper[4886]: I0129 16:26:46.764489 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 29 16:26:46 crc kubenswrapper[4886]: I0129 16:26:46.831305 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 29 16:26:47 crc kubenswrapper[4886]: I0129 16:26:47.165321 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 29 16:26:47 crc kubenswrapper[4886]: I0129 16:26:47.198691 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 29 16:26:47 crc kubenswrapper[4886]: I0129 16:26:47.259292 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 29 16:26:47 crc kubenswrapper[4886]: I0129 16:26:47.266289 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 29 16:26:47 crc kubenswrapper[4886]: I0129 16:26:47.365322 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 29 16:26:47 crc kubenswrapper[4886]: I0129 16:26:47.462277 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 29 16:26:47 crc kubenswrapper[4886]: I0129 16:26:47.675205 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 29 16:26:47 crc kubenswrapper[4886]: I0129 16:26:47.704025 4886 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 29 16:26:47 crc 
kubenswrapper[4886]: I0129 16:26:47.902445 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 29 16:26:47 crc kubenswrapper[4886]: I0129 16:26:47.960393 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 29 16:26:47 crc kubenswrapper[4886]: I0129 16:26:47.968875 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 29 16:26:47 crc kubenswrapper[4886]: I0129 16:26:47.969947 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.065863 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.106964 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.120250 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.241238 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.289573 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.302057 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.493352 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.573495 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.633461 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.657793 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.786187 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.876399 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.876597 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.927597 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.965518 4886 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-default-metrics-tls" Jan 29 16:26:48 crc kubenswrapper[4886]: I0129 16:26:48.970822 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.103272 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.121828 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.128556 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.148975 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.170996 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.174456 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.182532 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.184423 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.377591 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.405710 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.437028 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.479591 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.498731 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.548994 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.623445 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.653796 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.677887 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 
16:26:49.685870 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.689886 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.744994 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.887305 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 29 16:26:49 crc kubenswrapper[4886]: I0129 16:26:49.914322 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.057850 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.127241 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.128997 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.140300 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.152435 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.167360 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.186284 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.248785 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.303431 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.385617 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.388015 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.502524 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.669402 4886 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.691942 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.725852 
4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.840632 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.841722 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.876713 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.904678 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.927372 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 29 16:26:50 crc kubenswrapper[4886]: I0129 16:26:50.959497 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.002818 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.151461 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.154792 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.199265 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.297950 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.332622 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.362147 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.413371 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.415962 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.420859 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.427049 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.540839 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.580818 4886 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.614978 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.702399 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.709604 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.864093 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.869209 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 29 16:26:51 crc kubenswrapper[4886]: I0129 16:26:51.972416 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.041740 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.071159 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.106879 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.169627 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.179611 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.188782 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.192150 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.195294 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.285505 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.328841 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.415663 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.419464 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 
16:26:52.553021 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.577980 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.769712 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 29 16:26:52 crc kubenswrapper[4886]: I0129 16:26:52.856230 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.014917 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.035373 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.059558 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.077954 4886 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.078050 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.078122 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.079128 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"4c54ca3c104e6bbe0325be1c3777b09d70215a073d7aa15018d297a353e4dbc6"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.079442 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://4c54ca3c104e6bbe0325be1c3777b09d70215a073d7aa15018d297a353e4dbc6" gracePeriod=30 Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.119950 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.162968 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.196429 4886 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.390459 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.439093 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.462554 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.466750 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.620250 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.630669 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.749033 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.807965 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.859520 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 29 16:26:53 crc kubenswrapper[4886]: I0129 16:26:53.868616 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:53.999923 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.016240 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.100495 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.129086 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.133668 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.230798 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.237597 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.282202 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.293168 4886 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.317482 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.357408 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.380988 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.391991 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.438317 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.499833 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.512225 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.526350 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.533979 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.673659 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.706353 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.706381 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.713942 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.724810 4886 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.863169 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 29 16:26:54 crc kubenswrapper[4886]: I0129 16:26:54.966908 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.003169 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.024163 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 
16:26:55.042427 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.047878 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.131066 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.160883 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.230677 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.247755 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.288644 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.339230 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.399580 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.461151 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.543917 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.611590 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.733799 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.747720 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.771708 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.844163 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.882224 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.923935 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 29 16:26:55 crc kubenswrapper[4886]: I0129 16:26:55.950053 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.099265 4886 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.136880 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.215174 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.264097 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.404140 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.463051 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.545780 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.558807 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.699022 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.703230 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.754311 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.883074 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.953886 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.983212 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 29 16:26:56 crc kubenswrapper[4886]: I0129 16:26:56.999616 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 29 16:26:57 crc kubenswrapper[4886]: I0129 16:26:57.053145 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 29 16:26:57 crc kubenswrapper[4886]: I0129 16:26:57.433875 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 29 16:26:57 crc kubenswrapper[4886]: I0129 16:26:57.726017 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 29 16:26:57 crc kubenswrapper[4886]: I0129 16:26:57.820563 4886 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 29 16:26:57 crc kubenswrapper[4886]: I0129 16:26:57.827158 4886 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 29 16:26:57 crc kubenswrapper[4886]: I0129 16:26:57.851366 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 29 16:26:57 crc kubenswrapper[4886]: I0129 16:26:57.952659 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.022135 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.113772 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.127453 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.208705 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.315779 4886 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.374041 4886 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.381745 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=47.381709368 podStartE2EDuration="47.381709368s" podCreationTimestamp="2026-01-29 16:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:26:33.653702989 +0000 UTC m=+276.562422261" watchObservedRunningTime="2026-01-29 16:26:58.381709368 +0000 UTC m=+301.290428690" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.386370 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-mpttg"] Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.386449 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg","openshift-kube-apiserver/kube-apiserver-crc"] Jan 29 16:26:58 crc kubenswrapper[4886]: E0129 16:26:58.386725 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b947565b-6a14-4bbd-881e-e82c33ca3a3b" containerName="oauth-openshift" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.386977 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b947565b-6a14-4bbd-881e-e82c33ca3a3b" containerName="oauth-openshift" Jan 29 16:26:58 crc kubenswrapper[4886]: E0129 16:26:58.387040 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" containerName="installer" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.387062 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" containerName="installer" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.387413 4886 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:58 crc 
kubenswrapper[4886]: I0129 16:26:58.387449 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9630c976-1bbd-4f14-b4c7-fc0436ca3705" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.388740 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b947565b-6a14-4bbd-881e-e82c33ca3a3b" containerName="oauth-openshift" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.388796 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9027a6d8-0cac-4276-b722-08c3a99c6cf9" containerName="installer" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.389857 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.395588 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.395825 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.395760 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.396580 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.396834 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.397909 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.398517 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.398745 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.399027 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.399578 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.399595 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.404271 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.404620 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.422091 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.422160 4886 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.429590 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.453782 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.453761451 podStartE2EDuration="25.453761451s" podCreationTimestamp="2026-01-29 16:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:26:58.442808599 +0000 UTC m=+301.351527971" watchObservedRunningTime="2026-01-29 16:26:58.453761451 +0000 UTC m=+301.362480733" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.490216 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.494517 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.498142 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515157 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-error\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515201 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515224 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-login\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515246 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-dir\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515261 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-869nb\" (UniqueName: \"kubernetes.io/projected/92af746d-c60d-46a4-9be0-0ad28882ac0e-kube-api-access-869nb\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: 
\"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515282 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515305 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515515 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-router-certs\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515591 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-service-ca\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515641 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515670 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515758 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515809 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-session\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.515841 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-policies\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.595081 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618190 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-session\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618489 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618519 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-policies\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618547 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-error\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618563 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618586 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-login\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: 
I0129 16:26:58.618607 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-dir\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618622 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-869nb\" (UniqueName: \"kubernetes.io/projected/92af746d-c60d-46a4-9be0-0ad28882ac0e-kube-api-access-869nb\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618639 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618662 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618684 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-router-certs\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618718 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-service-ca\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618736 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618758 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.618969 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-dir\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.619845 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.622063 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.622918 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.622915 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.623384 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.623447 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.623575 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.623695 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.625394 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.626977 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.627240 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b947565b-6a14-4bbd-881e-e82c33ca3a3b" path="/var/lib/kubelet/pods/b947565b-6a14-4bbd-881e-e82c33ca3a3b/volumes" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.630373 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-service-ca\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.630606 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.631295 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-policies\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.635258 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.635878 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-session\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.636041 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-router-certs\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.636980 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.638944 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-error\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.639254 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-login\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.639827 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.639956 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.639968 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.643880 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.650589 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.651993 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.653259 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.662003 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.677400 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-869nb\" (UniqueName: \"kubernetes.io/projected/92af746d-c60d-46a4-9be0-0ad28882ac0e-kube-api-access-869nb\") pod \"oauth-openshift-9fbfc7dc4-r9gqg\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.735949 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.744821 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.747593 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.784539 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 29 16:26:58 crc kubenswrapper[4886]: I0129 16:26:58.814193 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 29 16:26:59 crc kubenswrapper[4886]: I0129 16:26:59.100530 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 29 16:26:59 crc kubenswrapper[4886]: I0129 16:26:59.177797 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg"] Jan 29 16:26:59 crc kubenswrapper[4886]: W0129 16:26:59.183607 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92af746d_c60d_46a4_9be0_0ad28882ac0e.slice/crio-14141aff9fbd287a70454765b395ba76ef2991c8de80ea1c92111cb0e0c784c3 WatchSource:0}: Error finding container 14141aff9fbd287a70454765b395ba76ef2991c8de80ea1c92111cb0e0c784c3: Status 404 returned error can't find the container with id 14141aff9fbd287a70454765b395ba76ef2991c8de80ea1c92111cb0e0c784c3 Jan 29 16:26:59 crc kubenswrapper[4886]: I0129 16:26:59.193124 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 29 16:26:59 crc kubenswrapper[4886]: I0129 16:26:59.253180 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 29 16:26:59 crc kubenswrapper[4886]: I0129 16:26:59.648173 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 29 16:26:59 crc kubenswrapper[4886]: I0129 16:26:59.673127 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 29 16:26:59 crc kubenswrapper[4886]: I0129 16:26:59.754744 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 29 16:26:59 crc kubenswrapper[4886]: I0129 16:26:59.848368 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 29 16:26:59 crc kubenswrapper[4886]: I0129 16:26:59.859408 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 29 16:27:00 crc kubenswrapper[4886]: I0129 16:27:00.175053 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" event={"ID":"92af746d-c60d-46a4-9be0-0ad28882ac0e","Type":"ContainerStarted","Data":"47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e"} Jan 29 16:27:00 crc kubenswrapper[4886]: I0129 16:27:00.175131 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" 
event={"ID":"92af746d-c60d-46a4-9be0-0ad28882ac0e","Type":"ContainerStarted","Data":"14141aff9fbd287a70454765b395ba76ef2991c8de80ea1c92111cb0e0c784c3"} Jan 29 16:27:00 crc kubenswrapper[4886]: I0129 16:27:00.199470 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" podStartSLOduration=56.19944665 podStartE2EDuration="56.19944665s" podCreationTimestamp="2026-01-29 16:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:27:00.198964385 +0000 UTC m=+303.107683657" watchObservedRunningTime="2026-01-29 16:27:00.19944665 +0000 UTC m=+303.108165962" Jan 29 16:27:00 crc kubenswrapper[4886]: I0129 16:27:00.278452 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 29 16:27:01 crc kubenswrapper[4886]: I0129 16:27:01.182132 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:27:01 crc kubenswrapper[4886]: I0129 16:27:01.191808 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:27:07 crc kubenswrapper[4886]: I0129 16:27:07.739129 4886 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 29 16:27:07 crc kubenswrapper[4886]: I0129 16:27:07.739843 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://e338e481af24aecd5ce5485aecf3d5729c1fbb23b68efbbc211fd833fc6aa1fa" gracePeriod=5 Jan 29 16:27:09 crc kubenswrapper[4886]: I0129 16:27:09.096961 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.002600 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.277557 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.277611 4886 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="e338e481af24aecd5ce5485aecf3d5729c1fbb23b68efbbc211fd833fc6aa1fa" exitCode=137 Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.325103 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.325206 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.448016 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.448114 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.448137 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.448167 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.448207 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.448225 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.448399 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.448452 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.448429 4886 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.448265 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.459060 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.549690 4886 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.549726 4886 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.549739 4886 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:13 crc kubenswrapper[4886]: I0129 16:27:13.549754 4886 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:14 crc kubenswrapper[4886]: I0129 16:27:14.284950 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 29 16:27:14 crc kubenswrapper[4886]: I0129 16:27:14.285269 4886 scope.go:117] "RemoveContainer" containerID="e338e481af24aecd5ce5485aecf3d5729c1fbb23b68efbbc211fd833fc6aa1fa" Jan 29 16:27:14 crc kubenswrapper[4886]: I0129 16:27:14.285398 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 16:27:14 crc kubenswrapper[4886]: I0129 16:27:14.623651 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 29 16:27:14 crc kubenswrapper[4886]: I0129 16:27:14.624038 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 29 16:27:14 crc kubenswrapper[4886]: I0129 16:27:14.642290 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 29 16:27:14 crc kubenswrapper[4886]: I0129 16:27:14.642386 4886 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="d3430f3b-6a12-4358-ba18-177e3d6eeb69" Jan 29 16:27:14 crc kubenswrapper[4886]: I0129 16:27:14.649408 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 29 16:27:14 crc kubenswrapper[4886]: I0129 16:27:14.649492 4886 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="d3430f3b-6a12-4358-ba18-177e3d6eeb69" Jan 29 16:27:23 crc kubenswrapper[4886]: I0129 16:27:23.354035 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 29 16:27:23 crc kubenswrapper[4886]: I0129 16:27:23.358234 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 29 16:27:23 crc kubenswrapper[4886]: I0129 16:27:23.358312 4886 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4c54ca3c104e6bbe0325be1c3777b09d70215a073d7aa15018d297a353e4dbc6" exitCode=137 Jan 29 16:27:23 crc kubenswrapper[4886]: I0129 16:27:23.358391 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4c54ca3c104e6bbe0325be1c3777b09d70215a073d7aa15018d297a353e4dbc6"} Jan 29 16:27:23 crc kubenswrapper[4886]: I0129 16:27:23.358444 4886 scope.go:117] "RemoveContainer" containerID="a370948657cae25c181170bc42e45d896e01469cb4079ad6ed412210527edb08" Jan 29 16:27:24 crc kubenswrapper[4886]: I0129 16:27:24.367036 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 29 16:27:24 crc kubenswrapper[4886]: I0129 16:27:24.369179 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28ed2d1f0f1eb97b92ecd5ed5ed65125b784ec21e7527d142ec869a0c7b7cfa0"} Jan 29 16:27:26 crc kubenswrapper[4886]: I0129 16:27:26.680721 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 16:27:33 crc kubenswrapper[4886]: I0129 16:27:33.078159 4886 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 16:27:33 crc kubenswrapper[4886]: I0129 16:27:33.086550 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 16:27:33 crc kubenswrapper[4886]: I0129 16:27:33.433993 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 16:27:39 crc kubenswrapper[4886]: I0129 16:27:39.479416 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 29 16:27:42 crc kubenswrapper[4886]: I0129 16:27:42.884925 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4rg2h"] Jan 29 16:27:42 crc kubenswrapper[4886]: I0129 16:27:42.885566 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4rg2h" podUID="4d5118e4-db44-4e09-a04d-2036e251936b" containerName="controller-manager" containerID="cri-o://074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec" gracePeriod=30 Jan 29 16:27:42 crc kubenswrapper[4886]: I0129 16:27:42.890759 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h57m9"] Jan 29 16:27:42 crc kubenswrapper[4886]: I0129 16:27:42.891085 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h57m9" podUID="eb068b0a-4b6b-48b7-bae4-ab193394f299" containerName="route-controller-manager" containerID="cri-o://bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca" gracePeriod=30 Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.311870 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h57m9" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.320440 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4rg2h" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.455388 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5118e4-db44-4e09-a04d-2036e251936b-config\") pod \"4d5118e4-db44-4e09-a04d-2036e251936b\" (UID: \"4d5118e4-db44-4e09-a04d-2036e251936b\") " Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.455433 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d5118e4-db44-4e09-a04d-2036e251936b-proxy-ca-bundles\") pod \"4d5118e4-db44-4e09-a04d-2036e251936b\" (UID: \"4d5118e4-db44-4e09-a04d-2036e251936b\") " Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.455464 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb068b0a-4b6b-48b7-bae4-ab193394f299-config\") pod \"eb068b0a-4b6b-48b7-bae4-ab193394f299\" (UID: \"eb068b0a-4b6b-48b7-bae4-ab193394f299\") " Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.455500 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb068b0a-4b6b-48b7-bae4-ab193394f299-client-ca\") pod \"eb068b0a-4b6b-48b7-bae4-ab193394f299\" (UID: \"eb068b0a-4b6b-48b7-bae4-ab193394f299\") " Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.455552 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d5118e4-db44-4e09-a04d-2036e251936b-serving-cert\") pod \"4d5118e4-db44-4e09-a04d-2036e251936b\" (UID: \"4d5118e4-db44-4e09-a04d-2036e251936b\") " Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.455575 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d5118e4-db44-4e09-a04d-2036e251936b-client-ca\") pod \"4d5118e4-db44-4e09-a04d-2036e251936b\" (UID: \"4d5118e4-db44-4e09-a04d-2036e251936b\") " Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.455605 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44jkf\" (UniqueName: \"kubernetes.io/projected/4d5118e4-db44-4e09-a04d-2036e251936b-kube-api-access-44jkf\") pod \"4d5118e4-db44-4e09-a04d-2036e251936b\" (UID: \"4d5118e4-db44-4e09-a04d-2036e251936b\") " Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.455624 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h8pr\" (UniqueName: \"kubernetes.io/projected/eb068b0a-4b6b-48b7-bae4-ab193394f299-kube-api-access-6h8pr\") pod \"eb068b0a-4b6b-48b7-bae4-ab193394f299\" (UID: \"eb068b0a-4b6b-48b7-bae4-ab193394f299\") " Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.455647 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb068b0a-4b6b-48b7-bae4-ab193394f299-serving-cert\") pod \"eb068b0a-4b6b-48b7-bae4-ab193394f299\" (UID: \"eb068b0a-4b6b-48b7-bae4-ab193394f299\") " Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.456976 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb068b0a-4b6b-48b7-bae4-ab193394f299-client-ca" (OuterVolumeSpecName: "client-ca") pod "eb068b0a-4b6b-48b7-bae4-ab193394f299" (UID: 
"eb068b0a-4b6b-48b7-bae4-ab193394f299"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.457161 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb068b0a-4b6b-48b7-bae4-ab193394f299-config" (OuterVolumeSpecName: "config") pod "eb068b0a-4b6b-48b7-bae4-ab193394f299" (UID: "eb068b0a-4b6b-48b7-bae4-ab193394f299"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.457225 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5118e4-db44-4e09-a04d-2036e251936b-client-ca" (OuterVolumeSpecName: "client-ca") pod "4d5118e4-db44-4e09-a04d-2036e251936b" (UID: "4d5118e4-db44-4e09-a04d-2036e251936b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.457242 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5118e4-db44-4e09-a04d-2036e251936b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4d5118e4-db44-4e09-a04d-2036e251936b" (UID: "4d5118e4-db44-4e09-a04d-2036e251936b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.457400 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5118e4-db44-4e09-a04d-2036e251936b-config" (OuterVolumeSpecName: "config") pod "4d5118e4-db44-4e09-a04d-2036e251936b" (UID: "4d5118e4-db44-4e09-a04d-2036e251936b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.465774 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb068b0a-4b6b-48b7-bae4-ab193394f299-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eb068b0a-4b6b-48b7-bae4-ab193394f299" (UID: "eb068b0a-4b6b-48b7-bae4-ab193394f299"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.465857 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5118e4-db44-4e09-a04d-2036e251936b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4d5118e4-db44-4e09-a04d-2036e251936b" (UID: "4d5118e4-db44-4e09-a04d-2036e251936b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.466100 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5118e4-db44-4e09-a04d-2036e251936b-kube-api-access-44jkf" (OuterVolumeSpecName: "kube-api-access-44jkf") pod "4d5118e4-db44-4e09-a04d-2036e251936b" (UID: "4d5118e4-db44-4e09-a04d-2036e251936b"). InnerVolumeSpecName "kube-api-access-44jkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.471894 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb068b0a-4b6b-48b7-bae4-ab193394f299-kube-api-access-6h8pr" (OuterVolumeSpecName: "kube-api-access-6h8pr") pod "eb068b0a-4b6b-48b7-bae4-ab193394f299" (UID: "eb068b0a-4b6b-48b7-bae4-ab193394f299"). 
InnerVolumeSpecName "kube-api-access-6h8pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.493565 4886 generic.go:334] "Generic (PLEG): container finished" podID="eb068b0a-4b6b-48b7-bae4-ab193394f299" containerID="bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca" exitCode=0 Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.493677 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h57m9" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.498415 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h57m9" event={"ID":"eb068b0a-4b6b-48b7-bae4-ab193394f299","Type":"ContainerDied","Data":"bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca"} Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.498475 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h57m9" event={"ID":"eb068b0a-4b6b-48b7-bae4-ab193394f299","Type":"ContainerDied","Data":"5b391d085c08e1c1dfac270a21f6cff67072029830c3d61c34b03a6c51728f7e"} Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.498493 4886 scope.go:117] "RemoveContainer" containerID="bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.505870 4886 generic.go:334] "Generic (PLEG): container finished" podID="4d5118e4-db44-4e09-a04d-2036e251936b" containerID="074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec" exitCode=0 Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.505914 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4rg2h" event={"ID":"4d5118e4-db44-4e09-a04d-2036e251936b","Type":"ContainerDied","Data":"074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec"} Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.505946 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4rg2h" event={"ID":"4d5118e4-db44-4e09-a04d-2036e251936b","Type":"ContainerDied","Data":"6fff8a070d1d246b9de78c2701294ccd82667531237f5c020ada5028f01e8438"} Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.506004 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4rg2h" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.529802 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h57m9"] Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.530434 4886 scope.go:117] "RemoveContainer" containerID="bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca" Jan 29 16:27:43 crc kubenswrapper[4886]: E0129 16:27:43.530985 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca\": container with ID starting with bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca not found: ID does not exist" containerID="bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.531039 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca"} err="failed to get container status \"bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca\": rpc error: code = NotFound desc = could not find container \"bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca\": container with ID starting with bf056c7b64d1db40a273e61237f21df213f55de77057daa8d3f79b233f6b1bca not found: ID does not exist" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.531070 4886 scope.go:117] "RemoveContainer" containerID="074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.535317 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h57m9"] Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.541723 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4rg2h"] Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.546108 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4rg2h"] Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.548657 4886 scope.go:117] "RemoveContainer" containerID="074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec" Jan 29 16:27:43 crc kubenswrapper[4886]: E0129 16:27:43.549119 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec\": container with ID starting with 074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec not found: ID does not exist" containerID="074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.549152 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec"} err="failed to get container status \"074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec\": rpc error: code = NotFound desc = could not find container \"074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec\": container with ID starting with 074bdcd69e5d52baa3572c419d1d23725c2153e656e43405d65063d3d379a2ec not found: ID does not exist" Jan 29 16:27:43 crc 
kubenswrapper[4886]: I0129 16:27:43.556931 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d5118e4-db44-4e09-a04d-2036e251936b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.556968 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d5118e4-db44-4e09-a04d-2036e251936b-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.556983 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44jkf\" (UniqueName: \"kubernetes.io/projected/4d5118e4-db44-4e09-a04d-2036e251936b-kube-api-access-44jkf\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.556998 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h8pr\" (UniqueName: \"kubernetes.io/projected/eb068b0a-4b6b-48b7-bae4-ab193394f299-kube-api-access-6h8pr\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.557010 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb068b0a-4b6b-48b7-bae4-ab193394f299-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.557021 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5118e4-db44-4e09-a04d-2036e251936b-config\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.557032 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d5118e4-db44-4e09-a04d-2036e251936b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.557042 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb068b0a-4b6b-48b7-bae4-ab193394f299-config\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:43 crc kubenswrapper[4886]: I0129 16:27:43.557053 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb068b0a-4b6b-48b7-bae4-ab193394f299-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.621853 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5118e4-db44-4e09-a04d-2036e251936b" path="/var/lib/kubelet/pods/4d5118e4-db44-4e09-a04d-2036e251936b/volumes" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.623012 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb068b0a-4b6b-48b7-bae4-ab193394f299" path="/var/lib/kubelet/pods/eb068b0a-4b6b-48b7-bae4-ab193394f299/volumes" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.863160 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-559577448b-qljqw"] Jan 29 16:27:44 crc kubenswrapper[4886]: E0129 16:27:44.863486 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5118e4-db44-4e09-a04d-2036e251936b" containerName="controller-manager" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.863505 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5118e4-db44-4e09-a04d-2036e251936b" containerName="controller-manager" Jan 29 16:27:44 crc kubenswrapper[4886]: E0129 16:27:44.863517 4886 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="eb068b0a-4b6b-48b7-bae4-ab193394f299" containerName="route-controller-manager" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.863525 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb068b0a-4b6b-48b7-bae4-ab193394f299" containerName="route-controller-manager" Jan 29 16:27:44 crc kubenswrapper[4886]: E0129 16:27:44.863535 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.863766 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.863911 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.863925 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb068b0a-4b6b-48b7-bae4-ab193394f299" containerName="route-controller-manager" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.863938 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5118e4-db44-4e09-a04d-2036e251936b" containerName="controller-manager" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.864450 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.865993 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.866179 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.866190 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49"] Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.866477 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.866599 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.866703 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.866772 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.868568 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.868623 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.868745 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.868817 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.869064 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.869176 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.869589 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.916797 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.930230 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49"] Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.937606 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-559577448b-qljqw"] Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.972551 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-client-ca\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.972609 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d01d62e5-f921-4e41-8744-23c91bf9310a-client-ca\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.972648 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-config\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.972685 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prrn5\" (UniqueName: 
\"kubernetes.io/projected/d01d62e5-f921-4e41-8744-23c91bf9310a-kube-api-access-prrn5\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.972715 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d01d62e5-f921-4e41-8744-23c91bf9310a-serving-cert\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.972743 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbqww\" (UniqueName: \"kubernetes.io/projected/e7b68f8a-9483-479e-bf2d-441dff994e02-kube-api-access-sbqww\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.972764 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7b68f8a-9483-479e-bf2d-441dff994e02-serving-cert\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.973090 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-proxy-ca-bundles\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:44 crc kubenswrapper[4886]: I0129 16:27:44.973234 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d01d62e5-f921-4e41-8744-23c91bf9310a-config\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.074774 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d01d62e5-f921-4e41-8744-23c91bf9310a-client-ca\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.074851 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-config\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.074883 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prrn5\" (UniqueName: 
\"kubernetes.io/projected/d01d62e5-f921-4e41-8744-23c91bf9310a-kube-api-access-prrn5\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.074923 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d01d62e5-f921-4e41-8744-23c91bf9310a-serving-cert\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.074941 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbqww\" (UniqueName: \"kubernetes.io/projected/e7b68f8a-9483-479e-bf2d-441dff994e02-kube-api-access-sbqww\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.074958 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7b68f8a-9483-479e-bf2d-441dff994e02-serving-cert\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.075001 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-proxy-ca-bundles\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.075125 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d01d62e5-f921-4e41-8744-23c91bf9310a-config\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.075169 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-client-ca\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.076574 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d01d62e5-f921-4e41-8744-23c91bf9310a-client-ca\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.076804 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-proxy-ca-bundles\") pod \"controller-manager-559577448b-qljqw\" (UID: 
\"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.076894 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d01d62e5-f921-4e41-8744-23c91bf9310a-config\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.076902 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-config\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.078252 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-client-ca\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.088501 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d01d62e5-f921-4e41-8744-23c91bf9310a-serving-cert\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.089943 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7b68f8a-9483-479e-bf2d-441dff994e02-serving-cert\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.091109 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prrn5\" (UniqueName: \"kubernetes.io/projected/d01d62e5-f921-4e41-8744-23c91bf9310a-kube-api-access-prrn5\") pod \"route-controller-manager-5dcd866c4c-tng49\" (UID: \"d01d62e5-f921-4e41-8744-23c91bf9310a\") " pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.092445 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbqww\" (UniqueName: \"kubernetes.io/projected/e7b68f8a-9483-479e-bf2d-441dff994e02-kube-api-access-sbqww\") pod \"controller-manager-559577448b-qljqw\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.235451 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.248890 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.641171 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49"] Jan 29 16:27:45 crc kubenswrapper[4886]: W0129 16:27:45.646791 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd01d62e5_f921_4e41_8744_23c91bf9310a.slice/crio-286af561bcdf63922bcd5294e28424bd5e44bed8924f37cc13287ce7fc2c6adc WatchSource:0}: Error finding container 286af561bcdf63922bcd5294e28424bd5e44bed8924f37cc13287ce7fc2c6adc: Status 404 returned error can't find the container with id 286af561bcdf63922bcd5294e28424bd5e44bed8924f37cc13287ce7fc2c6adc Jan 29 16:27:45 crc kubenswrapper[4886]: I0129 16:27:45.694948 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-559577448b-qljqw"] Jan 29 16:27:45 crc kubenswrapper[4886]: W0129 16:27:45.703130 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b68f8a_9483_479e_bf2d_441dff994e02.slice/crio-61013901f79515c510fd797b6e9c94166fd6b2d802a9282570c4f90aaedd5f07 WatchSource:0}: Error finding container 61013901f79515c510fd797b6e9c94166fd6b2d802a9282570c4f90aaedd5f07: Status 404 returned error can't find the container with id 61013901f79515c510fd797b6e9c94166fd6b2d802a9282570c4f90aaedd5f07 Jan 29 16:27:46 crc kubenswrapper[4886]: I0129 16:27:46.527367 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-559577448b-qljqw" event={"ID":"e7b68f8a-9483-479e-bf2d-441dff994e02","Type":"ContainerStarted","Data":"1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb"} Jan 29 16:27:46 crc kubenswrapper[4886]: I0129 16:27:46.527702 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-559577448b-qljqw" event={"ID":"e7b68f8a-9483-479e-bf2d-441dff994e02","Type":"ContainerStarted","Data":"61013901f79515c510fd797b6e9c94166fd6b2d802a9282570c4f90aaedd5f07"} Jan 29 16:27:46 crc kubenswrapper[4886]: I0129 16:27:46.527720 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:46 crc kubenswrapper[4886]: I0129 16:27:46.528743 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" event={"ID":"d01d62e5-f921-4e41-8744-23c91bf9310a","Type":"ContainerStarted","Data":"c5d1a86fa5476a1471825e4a1459b1da433b49876ffb5250f488558bb19e09ec"} Jan 29 16:27:46 crc kubenswrapper[4886]: I0129 16:27:46.528774 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" event={"ID":"d01d62e5-f921-4e41-8744-23c91bf9310a","Type":"ContainerStarted","Data":"286af561bcdf63922bcd5294e28424bd5e44bed8924f37cc13287ce7fc2c6adc"} Jan 29 16:27:46 crc kubenswrapper[4886]: I0129 16:27:46.528955 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:46 crc kubenswrapper[4886]: I0129 16:27:46.531421 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:27:46 crc kubenswrapper[4886]: I0129 16:27:46.535684 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" Jan 29 16:27:46 crc kubenswrapper[4886]: I0129 16:27:46.560208 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-559577448b-qljqw" podStartSLOduration=4.560189992 podStartE2EDuration="4.560189992s" podCreationTimestamp="2026-01-29 16:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:27:46.545238557 +0000 UTC m=+349.453957849" watchObservedRunningTime="2026-01-29 16:27:46.560189992 +0000 UTC m=+349.468909264" Jan 29 16:27:46 crc kubenswrapper[4886]: I0129 16:27:46.575275 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5dcd866c4c-tng49" podStartSLOduration=4.57525874 podStartE2EDuration="4.57525874s" podCreationTimestamp="2026-01-29 16:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:27:46.574379074 +0000 UTC m=+349.483098356" watchObservedRunningTime="2026-01-29 16:27:46.57525874 +0000 UTC m=+349.483978012" Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.830657 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xcj6l"] Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.831300 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xcj6l" podUID="047adc93-cb46-4ba7-bbdf-4d485a08ea6b" containerName="registry-server" containerID="cri-o://bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b" gracePeriod=30 Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.847498 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cj9vs"] Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.847824 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cj9vs" podUID="434ccaea-8a30-4a97-8908-64bc9f550de0" containerName="registry-server" containerID="cri-o://adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b" gracePeriod=30 Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.861200 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w8bm4"] Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.861586 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-w8bm4" podUID="17accc89-e860-4b12-b5b3-3da7adaa3430" containerName="marketplace-operator" containerID="cri-o://fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22" gracePeriod=30 Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.868424 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzc5s"] Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.868675 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xzc5s" 
podUID="d8a07d27-67fb-47e8-9032-e4f831983d75" containerName="registry-server" containerID="cri-o://233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f" gracePeriod=30 Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.872493 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtk7r"] Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.873760 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.880475 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6hph6"] Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.880860 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6hph6" podUID="c36e6697-37b9-4b10-baea-0f9c92014c79" containerName="registry-server" containerID="cri-o://9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc" gracePeriod=30 Jan 29 16:27:47 crc kubenswrapper[4886]: I0129 16:27:47.893642 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtk7r"] Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.021036 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qtk7r\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.021316 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzm6k\" (UniqueName: \"kubernetes.io/projected/42b8dc70-b29d-4995-9727-9b8e032bdad9-kube-api-access-pzm6k\") pod \"marketplace-operator-79b997595-qtk7r\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.021379 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qtk7r\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.122283 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qtk7r\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.122397 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qtk7r\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:48 crc 
kubenswrapper[4886]: I0129 16:27:48.122438 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzm6k\" (UniqueName: \"kubernetes.io/projected/42b8dc70-b29d-4995-9727-9b8e032bdad9-kube-api-access-pzm6k\") pod \"marketplace-operator-79b997595-qtk7r\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.123988 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qtk7r\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.143804 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzm6k\" (UniqueName: \"kubernetes.io/projected/42b8dc70-b29d-4995-9727-9b8e032bdad9-kube-api-access-pzm6k\") pod \"marketplace-operator-79b997595-qtk7r\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.146635 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qtk7r\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.282406 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.300060 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcj6l" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.379603 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6hph6" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.396860 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w8bm4" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.401953 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cj9vs" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.428676 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn6qn\" (UniqueName: \"kubernetes.io/projected/047adc93-cb46-4ba7-bbdf-4d485a08ea6b-kube-api-access-xn6qn\") pod \"047adc93-cb46-4ba7-bbdf-4d485a08ea6b\" (UID: \"047adc93-cb46-4ba7-bbdf-4d485a08ea6b\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.428989 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047adc93-cb46-4ba7-bbdf-4d485a08ea6b-utilities\") pod \"047adc93-cb46-4ba7-bbdf-4d485a08ea6b\" (UID: \"047adc93-cb46-4ba7-bbdf-4d485a08ea6b\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.429028 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047adc93-cb46-4ba7-bbdf-4d485a08ea6b-catalog-content\") pod \"047adc93-cb46-4ba7-bbdf-4d485a08ea6b\" (UID: \"047adc93-cb46-4ba7-bbdf-4d485a08ea6b\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.432955 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047adc93-cb46-4ba7-bbdf-4d485a08ea6b-utilities" (OuterVolumeSpecName: "utilities") pod "047adc93-cb46-4ba7-bbdf-4d485a08ea6b" (UID: "047adc93-cb46-4ba7-bbdf-4d485a08ea6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.439217 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047adc93-cb46-4ba7-bbdf-4d485a08ea6b-kube-api-access-xn6qn" (OuterVolumeSpecName: "kube-api-access-xn6qn") pod "047adc93-cb46-4ba7-bbdf-4d485a08ea6b" (UID: "047adc93-cb46-4ba7-bbdf-4d485a08ea6b"). InnerVolumeSpecName "kube-api-access-xn6qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.445724 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzc5s" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.509250 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047adc93-cb46-4ba7-bbdf-4d485a08ea6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "047adc93-cb46-4ba7-bbdf-4d485a08ea6b" (UID: "047adc93-cb46-4ba7-bbdf-4d485a08ea6b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.530426 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434ccaea-8a30-4a97-8908-64bc9f550de0-catalog-content\") pod \"434ccaea-8a30-4a97-8908-64bc9f550de0\" (UID: \"434ccaea-8a30-4a97-8908-64bc9f550de0\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.530510 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434ccaea-8a30-4a97-8908-64bc9f550de0-utilities\") pod \"434ccaea-8a30-4a97-8908-64bc9f550de0\" (UID: \"434ccaea-8a30-4a97-8908-64bc9f550de0\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.530572 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/17accc89-e860-4b12-b5b3-3da7adaa3430-marketplace-operator-metrics\") pod \"17accc89-e860-4b12-b5b3-3da7adaa3430\" (UID: \"17accc89-e860-4b12-b5b3-3da7adaa3430\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.530605 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gjgt\" (UniqueName: \"kubernetes.io/projected/434ccaea-8a30-4a97-8908-64bc9f550de0-kube-api-access-4gjgt\") pod \"434ccaea-8a30-4a97-8908-64bc9f550de0\" (UID: \"434ccaea-8a30-4a97-8908-64bc9f550de0\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.530657 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf8xv\" (UniqueName: \"kubernetes.io/projected/c36e6697-37b9-4b10-baea-0f9c92014c79-kube-api-access-qf8xv\") pod \"c36e6697-37b9-4b10-baea-0f9c92014c79\" (UID: \"c36e6697-37b9-4b10-baea-0f9c92014c79\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.530699 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbgjh\" (UniqueName: \"kubernetes.io/projected/17accc89-e860-4b12-b5b3-3da7adaa3430-kube-api-access-fbgjh\") pod \"17accc89-e860-4b12-b5b3-3da7adaa3430\" (UID: \"17accc89-e860-4b12-b5b3-3da7adaa3430\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.530750 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xncm2\" (UniqueName: \"kubernetes.io/projected/d8a07d27-67fb-47e8-9032-e4f831983d75-kube-api-access-xncm2\") pod \"d8a07d27-67fb-47e8-9032-e4f831983d75\" (UID: \"d8a07d27-67fb-47e8-9032-e4f831983d75\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.530778 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c36e6697-37b9-4b10-baea-0f9c92014c79-utilities\") pod \"c36e6697-37b9-4b10-baea-0f9c92014c79\" (UID: \"c36e6697-37b9-4b10-baea-0f9c92014c79\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.530840 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a07d27-67fb-47e8-9032-e4f831983d75-catalog-content\") pod \"d8a07d27-67fb-47e8-9032-e4f831983d75\" (UID: \"d8a07d27-67fb-47e8-9032-e4f831983d75\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.530864 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c36e6697-37b9-4b10-baea-0f9c92014c79-catalog-content\") pod \"c36e6697-37b9-4b10-baea-0f9c92014c79\" (UID: \"c36e6697-37b9-4b10-baea-0f9c92014c79\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.531547 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/17accc89-e860-4b12-b5b3-3da7adaa3430-marketplace-trusted-ca\") pod \"17accc89-e860-4b12-b5b3-3da7adaa3430\" (UID: \"17accc89-e860-4b12-b5b3-3da7adaa3430\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.531626 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a07d27-67fb-47e8-9032-e4f831983d75-utilities\") pod \"d8a07d27-67fb-47e8-9032-e4f831983d75\" (UID: \"d8a07d27-67fb-47e8-9032-e4f831983d75\") " Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.531995 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn6qn\" (UniqueName: \"kubernetes.io/projected/047adc93-cb46-4ba7-bbdf-4d485a08ea6b-kube-api-access-xn6qn\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.532051 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047adc93-cb46-4ba7-bbdf-4d485a08ea6b-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.532067 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047adc93-cb46-4ba7-bbdf-4d485a08ea6b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.531465 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/434ccaea-8a30-4a97-8908-64bc9f550de0-utilities" (OuterVolumeSpecName: "utilities") pod "434ccaea-8a30-4a97-8908-64bc9f550de0" (UID: "434ccaea-8a30-4a97-8908-64bc9f550de0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.533051 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a07d27-67fb-47e8-9032-e4f831983d75-utilities" (OuterVolumeSpecName: "utilities") pod "d8a07d27-67fb-47e8-9032-e4f831983d75" (UID: "d8a07d27-67fb-47e8-9032-e4f831983d75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.533430 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c36e6697-37b9-4b10-baea-0f9c92014c79-utilities" (OuterVolumeSpecName: "utilities") pod "c36e6697-37b9-4b10-baea-0f9c92014c79" (UID: "c36e6697-37b9-4b10-baea-0f9c92014c79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.535483 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17accc89-e860-4b12-b5b3-3da7adaa3430-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "17accc89-e860-4b12-b5b3-3da7adaa3430" (UID: "17accc89-e860-4b12-b5b3-3da7adaa3430"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.536087 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/434ccaea-8a30-4a97-8908-64bc9f550de0-kube-api-access-4gjgt" (OuterVolumeSpecName: "kube-api-access-4gjgt") pod "434ccaea-8a30-4a97-8908-64bc9f550de0" (UID: "434ccaea-8a30-4a97-8908-64bc9f550de0"). InnerVolumeSpecName "kube-api-access-4gjgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.537219 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a07d27-67fb-47e8-9032-e4f831983d75-kube-api-access-xncm2" (OuterVolumeSpecName: "kube-api-access-xncm2") pod "d8a07d27-67fb-47e8-9032-e4f831983d75" (UID: "d8a07d27-67fb-47e8-9032-e4f831983d75"). InnerVolumeSpecName "kube-api-access-xncm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.538567 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c36e6697-37b9-4b10-baea-0f9c92014c79-kube-api-access-qf8xv" (OuterVolumeSpecName: "kube-api-access-qf8xv") pod "c36e6697-37b9-4b10-baea-0f9c92014c79" (UID: "c36e6697-37b9-4b10-baea-0f9c92014c79"). InnerVolumeSpecName "kube-api-access-qf8xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.542123 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17accc89-e860-4b12-b5b3-3da7adaa3430-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "17accc89-e860-4b12-b5b3-3da7adaa3430" (UID: "17accc89-e860-4b12-b5b3-3da7adaa3430"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.543275 4886 generic.go:334] "Generic (PLEG): container finished" podID="d8a07d27-67fb-47e8-9032-e4f831983d75" containerID="233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f" exitCode=0 Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.543349 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzc5s" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.543391 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzc5s" event={"ID":"d8a07d27-67fb-47e8-9032-e4f831983d75","Type":"ContainerDied","Data":"233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f"} Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.543449 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzc5s" event={"ID":"d8a07d27-67fb-47e8-9032-e4f831983d75","Type":"ContainerDied","Data":"8df354200569f756ef71068446371a43cfad097210faf33ea3e2d3966f2eb917"} Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.543498 4886 scope.go:117] "RemoveContainer" containerID="233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.552544 4886 generic.go:334] "Generic (PLEG): container finished" podID="434ccaea-8a30-4a97-8908-64bc9f550de0" containerID="adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b" exitCode=0 Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.552690 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cj9vs" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.552727 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cj9vs" event={"ID":"434ccaea-8a30-4a97-8908-64bc9f550de0","Type":"ContainerDied","Data":"adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b"} Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.552764 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cj9vs" event={"ID":"434ccaea-8a30-4a97-8908-64bc9f550de0","Type":"ContainerDied","Data":"c930283727a8af009300e17c576da570a17d69226a2431e0b8f6442ab7a33682"} Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.553116 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17accc89-e860-4b12-b5b3-3da7adaa3430-kube-api-access-fbgjh" (OuterVolumeSpecName: "kube-api-access-fbgjh") pod "17accc89-e860-4b12-b5b3-3da7adaa3430" (UID: "17accc89-e860-4b12-b5b3-3da7adaa3430"). InnerVolumeSpecName "kube-api-access-fbgjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.555398 4886 generic.go:334] "Generic (PLEG): container finished" podID="c36e6697-37b9-4b10-baea-0f9c92014c79" containerID="9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc" exitCode=0 Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.555448 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hph6" event={"ID":"c36e6697-37b9-4b10-baea-0f9c92014c79","Type":"ContainerDied","Data":"9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc"} Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.555469 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hph6" event={"ID":"c36e6697-37b9-4b10-baea-0f9c92014c79","Type":"ContainerDied","Data":"2597500a6782cab3fff1d1bf05e088755f933968f6726da1d1dcae802c73e7f3"} Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.555535 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6hph6" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.564880 4886 generic.go:334] "Generic (PLEG): container finished" podID="047adc93-cb46-4ba7-bbdf-4d485a08ea6b" containerID="bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b" exitCode=0 Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.564930 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcj6l" event={"ID":"047adc93-cb46-4ba7-bbdf-4d485a08ea6b","Type":"ContainerDied","Data":"bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b"} Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.564952 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcj6l" event={"ID":"047adc93-cb46-4ba7-bbdf-4d485a08ea6b","Type":"ContainerDied","Data":"b49a4641d27203a40e0f7e4f28f82c1063741221c6c208a86d4e1a5bc30f7000"} Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.565009 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcj6l" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.566525 4886 scope.go:117] "RemoveContainer" containerID="ceae5fdac3eed7f1c5974c445ed3419dbfa10feff4c8309145af3e9ea005f153" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.568600 4886 generic.go:334] "Generic (PLEG): container finished" podID="17accc89-e860-4b12-b5b3-3da7adaa3430" containerID="fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22" exitCode=0 Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.568704 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w8bm4" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.568725 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w8bm4" event={"ID":"17accc89-e860-4b12-b5b3-3da7adaa3430","Type":"ContainerDied","Data":"fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22"} Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.568776 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w8bm4" event={"ID":"17accc89-e860-4b12-b5b3-3da7adaa3430","Type":"ContainerDied","Data":"496e5ab4c79c2396e707c4fc94a4d2815e8f1572d6df45519acda3977888c122"} Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.576632 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a07d27-67fb-47e8-9032-e4f831983d75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8a07d27-67fb-47e8-9032-e4f831983d75" (UID: "d8a07d27-67fb-47e8-9032-e4f831983d75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.589265 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/434ccaea-8a30-4a97-8908-64bc9f550de0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "434ccaea-8a30-4a97-8908-64bc9f550de0" (UID: "434ccaea-8a30-4a97-8908-64bc9f550de0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.590813 4886 scope.go:117] "RemoveContainer" containerID="3fb3181dff0539237c77e3f3e6bfc2daf84ba731ba94f2127334c7ba90e867dd" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.593758 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xcj6l"] Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.600704 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xcj6l"] Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.604510 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w8bm4"] Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.607145 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w8bm4"] Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.621305 4886 scope.go:117] "RemoveContainer" containerID="233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.621801 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f\": container with ID starting with 233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f not found: ID does not exist" containerID="233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.621835 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f"} err="failed to get container status \"233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f\": rpc error: code = NotFound desc = could not find container \"233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f\": container with ID starting with 233eefe83f891bb8ff6279b8ca319fdb899c0d7dc84bfe73ee251483fff54d0f not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.621858 4886 scope.go:117] "RemoveContainer" containerID="ceae5fdac3eed7f1c5974c445ed3419dbfa10feff4c8309145af3e9ea005f153" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.622383 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceae5fdac3eed7f1c5974c445ed3419dbfa10feff4c8309145af3e9ea005f153\": container with ID starting with ceae5fdac3eed7f1c5974c445ed3419dbfa10feff4c8309145af3e9ea005f153 not found: ID does not exist" containerID="ceae5fdac3eed7f1c5974c445ed3419dbfa10feff4c8309145af3e9ea005f153" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.622419 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceae5fdac3eed7f1c5974c445ed3419dbfa10feff4c8309145af3e9ea005f153"} err="failed to get container status \"ceae5fdac3eed7f1c5974c445ed3419dbfa10feff4c8309145af3e9ea005f153\": rpc error: code = NotFound desc = could not find container \"ceae5fdac3eed7f1c5974c445ed3419dbfa10feff4c8309145af3e9ea005f153\": container with ID starting with ceae5fdac3eed7f1c5974c445ed3419dbfa10feff4c8309145af3e9ea005f153 not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.622445 4886 scope.go:117] "RemoveContainer" 
containerID="3fb3181dff0539237c77e3f3e6bfc2daf84ba731ba94f2127334c7ba90e867dd" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.622658 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fb3181dff0539237c77e3f3e6bfc2daf84ba731ba94f2127334c7ba90e867dd\": container with ID starting with 3fb3181dff0539237c77e3f3e6bfc2daf84ba731ba94f2127334c7ba90e867dd not found: ID does not exist" containerID="3fb3181dff0539237c77e3f3e6bfc2daf84ba731ba94f2127334c7ba90e867dd" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.622683 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fb3181dff0539237c77e3f3e6bfc2daf84ba731ba94f2127334c7ba90e867dd"} err="failed to get container status \"3fb3181dff0539237c77e3f3e6bfc2daf84ba731ba94f2127334c7ba90e867dd\": rpc error: code = NotFound desc = could not find container \"3fb3181dff0539237c77e3f3e6bfc2daf84ba731ba94f2127334c7ba90e867dd\": container with ID starting with 3fb3181dff0539237c77e3f3e6bfc2daf84ba731ba94f2127334c7ba90e867dd not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.622697 4886 scope.go:117] "RemoveContainer" containerID="adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.624749 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047adc93-cb46-4ba7-bbdf-4d485a08ea6b" path="/var/lib/kubelet/pods/047adc93-cb46-4ba7-bbdf-4d485a08ea6b/volumes" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.625452 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17accc89-e860-4b12-b5b3-3da7adaa3430" path="/var/lib/kubelet/pods/17accc89-e860-4b12-b5b3-3da7adaa3430/volumes" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.633306 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c36e6697-37b9-4b10-baea-0f9c92014c79-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.633397 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a07d27-67fb-47e8-9032-e4f831983d75-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.633410 4886 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/17accc89-e860-4b12-b5b3-3da7adaa3430-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.633421 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a07d27-67fb-47e8-9032-e4f831983d75-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.633430 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434ccaea-8a30-4a97-8908-64bc9f550de0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.633438 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434ccaea-8a30-4a97-8908-64bc9f550de0-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.633446 4886 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/17accc89-e860-4b12-b5b3-3da7adaa3430-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.633455 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gjgt\" (UniqueName: \"kubernetes.io/projected/434ccaea-8a30-4a97-8908-64bc9f550de0-kube-api-access-4gjgt\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.633464 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf8xv\" (UniqueName: \"kubernetes.io/projected/c36e6697-37b9-4b10-baea-0f9c92014c79-kube-api-access-qf8xv\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.633474 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbgjh\" (UniqueName: \"kubernetes.io/projected/17accc89-e860-4b12-b5b3-3da7adaa3430-kube-api-access-fbgjh\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.633484 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xncm2\" (UniqueName: \"kubernetes.io/projected/d8a07d27-67fb-47e8-9032-e4f831983d75-kube-api-access-xncm2\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.635733 4886 scope.go:117] "RemoveContainer" containerID="5848b4e5a6379779bfe01d51a16e2bc5ee511c62178bbd791e055867e63873da" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.652837 4886 scope.go:117] "RemoveContainer" containerID="9b90bb78250828a8de92c52ee575ca760465a8522cc7fc51c14297899de5ae91" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.664197 4886 scope.go:117] "RemoveContainer" containerID="adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.664597 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b\": container with ID starting with adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b not found: ID does not exist" containerID="adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.664644 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b"} err="failed to get container status \"adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b\": rpc error: code = NotFound desc = could not find container \"adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b\": container with ID starting with adf2c14310b6a7ba403bcc63dd65fff6abbc7aa1ceb7c9a65b7e84de9cf1376b not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.664680 4886 scope.go:117] "RemoveContainer" containerID="5848b4e5a6379779bfe01d51a16e2bc5ee511c62178bbd791e055867e63873da" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.664944 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5848b4e5a6379779bfe01d51a16e2bc5ee511c62178bbd791e055867e63873da\": container with ID starting with 5848b4e5a6379779bfe01d51a16e2bc5ee511c62178bbd791e055867e63873da not found: ID does not exist" containerID="5848b4e5a6379779bfe01d51a16e2bc5ee511c62178bbd791e055867e63873da" Jan 29 16:27:48 
crc kubenswrapper[4886]: I0129 16:27:48.665006 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5848b4e5a6379779bfe01d51a16e2bc5ee511c62178bbd791e055867e63873da"} err="failed to get container status \"5848b4e5a6379779bfe01d51a16e2bc5ee511c62178bbd791e055867e63873da\": rpc error: code = NotFound desc = could not find container \"5848b4e5a6379779bfe01d51a16e2bc5ee511c62178bbd791e055867e63873da\": container with ID starting with 5848b4e5a6379779bfe01d51a16e2bc5ee511c62178bbd791e055867e63873da not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.665026 4886 scope.go:117] "RemoveContainer" containerID="9b90bb78250828a8de92c52ee575ca760465a8522cc7fc51c14297899de5ae91" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.665248 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b90bb78250828a8de92c52ee575ca760465a8522cc7fc51c14297899de5ae91\": container with ID starting with 9b90bb78250828a8de92c52ee575ca760465a8522cc7fc51c14297899de5ae91 not found: ID does not exist" containerID="9b90bb78250828a8de92c52ee575ca760465a8522cc7fc51c14297899de5ae91" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.665276 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b90bb78250828a8de92c52ee575ca760465a8522cc7fc51c14297899de5ae91"} err="failed to get container status \"9b90bb78250828a8de92c52ee575ca760465a8522cc7fc51c14297899de5ae91\": rpc error: code = NotFound desc = could not find container \"9b90bb78250828a8de92c52ee575ca760465a8522cc7fc51c14297899de5ae91\": container with ID starting with 9b90bb78250828a8de92c52ee575ca760465a8522cc7fc51c14297899de5ae91 not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.665291 4886 scope.go:117] "RemoveContainer" containerID="9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.676931 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c36e6697-37b9-4b10-baea-0f9c92014c79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c36e6697-37b9-4b10-baea-0f9c92014c79" (UID: "c36e6697-37b9-4b10-baea-0f9c92014c79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.688695 4886 scope.go:117] "RemoveContainer" containerID="7344b3cddb96e29cffb588d3f380405658d001e938c3fd9a59f0d4c9ea5aa16e" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.707130 4886 scope.go:117] "RemoveContainer" containerID="0cdb18d5f5fa9a44559e46fd01c9effbb1ab6cf3c5ac5db03199ac60dda03f17" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.721468 4886 scope.go:117] "RemoveContainer" containerID="9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.722154 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc\": container with ID starting with 9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc not found: ID does not exist" containerID="9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.722196 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc"} err="failed to get container status \"9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc\": rpc error: code = NotFound desc = could not find container \"9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc\": container with ID starting with 9d4035b0a0d02345b7ffc32586d2f6e1f50c9f460c46150e1796f4be0de2d1cc not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.722224 4886 scope.go:117] "RemoveContainer" containerID="7344b3cddb96e29cffb588d3f380405658d001e938c3fd9a59f0d4c9ea5aa16e" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.722593 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7344b3cddb96e29cffb588d3f380405658d001e938c3fd9a59f0d4c9ea5aa16e\": container with ID starting with 7344b3cddb96e29cffb588d3f380405658d001e938c3fd9a59f0d4c9ea5aa16e not found: ID does not exist" containerID="7344b3cddb96e29cffb588d3f380405658d001e938c3fd9a59f0d4c9ea5aa16e" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.722633 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7344b3cddb96e29cffb588d3f380405658d001e938c3fd9a59f0d4c9ea5aa16e"} err="failed to get container status \"7344b3cddb96e29cffb588d3f380405658d001e938c3fd9a59f0d4c9ea5aa16e\": rpc error: code = NotFound desc = could not find container \"7344b3cddb96e29cffb588d3f380405658d001e938c3fd9a59f0d4c9ea5aa16e\": container with ID starting with 7344b3cddb96e29cffb588d3f380405658d001e938c3fd9a59f0d4c9ea5aa16e not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.722681 4886 scope.go:117] "RemoveContainer" containerID="0cdb18d5f5fa9a44559e46fd01c9effbb1ab6cf3c5ac5db03199ac60dda03f17" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.723972 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cdb18d5f5fa9a44559e46fd01c9effbb1ab6cf3c5ac5db03199ac60dda03f17\": container with ID starting with 0cdb18d5f5fa9a44559e46fd01c9effbb1ab6cf3c5ac5db03199ac60dda03f17 not found: ID does not exist" containerID="0cdb18d5f5fa9a44559e46fd01c9effbb1ab6cf3c5ac5db03199ac60dda03f17" Jan 29 16:27:48 crc 
kubenswrapper[4886]: I0129 16:27:48.724018 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cdb18d5f5fa9a44559e46fd01c9effbb1ab6cf3c5ac5db03199ac60dda03f17"} err="failed to get container status \"0cdb18d5f5fa9a44559e46fd01c9effbb1ab6cf3c5ac5db03199ac60dda03f17\": rpc error: code = NotFound desc = could not find container \"0cdb18d5f5fa9a44559e46fd01c9effbb1ab6cf3c5ac5db03199ac60dda03f17\": container with ID starting with 0cdb18d5f5fa9a44559e46fd01c9effbb1ab6cf3c5ac5db03199ac60dda03f17 not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.724184 4886 scope.go:117] "RemoveContainer" containerID="bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.735257 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c36e6697-37b9-4b10-baea-0f9c92014c79-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.738021 4886 scope.go:117] "RemoveContainer" containerID="11d0ed20cabb97cd96a252527a2f57cbc3a01707b987d53593bc18c03df398cf" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.803826 4886 scope.go:117] "RemoveContainer" containerID="587e95e478255c5ab7978918eda8a5869d425a31c3fad8525cf07ea38da482d5" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.823113 4886 scope.go:117] "RemoveContainer" containerID="bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.825572 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b\": container with ID starting with bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b not found: ID does not exist" containerID="bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.825763 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b"} err="failed to get container status \"bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b\": rpc error: code = NotFound desc = could not find container \"bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b\": container with ID starting with bd7f7f68af6c019f5874ecc65bfcb6fd76594d7f15c29ffa88fbdeda070e9c5b not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.825884 4886 scope.go:117] "RemoveContainer" containerID="11d0ed20cabb97cd96a252527a2f57cbc3a01707b987d53593bc18c03df398cf" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.826327 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11d0ed20cabb97cd96a252527a2f57cbc3a01707b987d53593bc18c03df398cf\": container with ID starting with 11d0ed20cabb97cd96a252527a2f57cbc3a01707b987d53593bc18c03df398cf not found: ID does not exist" containerID="11d0ed20cabb97cd96a252527a2f57cbc3a01707b987d53593bc18c03df398cf" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.826380 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11d0ed20cabb97cd96a252527a2f57cbc3a01707b987d53593bc18c03df398cf"} err="failed to get container status 
\"11d0ed20cabb97cd96a252527a2f57cbc3a01707b987d53593bc18c03df398cf\": rpc error: code = NotFound desc = could not find container \"11d0ed20cabb97cd96a252527a2f57cbc3a01707b987d53593bc18c03df398cf\": container with ID starting with 11d0ed20cabb97cd96a252527a2f57cbc3a01707b987d53593bc18c03df398cf not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.826407 4886 scope.go:117] "RemoveContainer" containerID="587e95e478255c5ab7978918eda8a5869d425a31c3fad8525cf07ea38da482d5" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.827582 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"587e95e478255c5ab7978918eda8a5869d425a31c3fad8525cf07ea38da482d5\": container with ID starting with 587e95e478255c5ab7978918eda8a5869d425a31c3fad8525cf07ea38da482d5 not found: ID does not exist" containerID="587e95e478255c5ab7978918eda8a5869d425a31c3fad8525cf07ea38da482d5" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.827624 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"587e95e478255c5ab7978918eda8a5869d425a31c3fad8525cf07ea38da482d5"} err="failed to get container status \"587e95e478255c5ab7978918eda8a5869d425a31c3fad8525cf07ea38da482d5\": rpc error: code = NotFound desc = could not find container \"587e95e478255c5ab7978918eda8a5869d425a31c3fad8525cf07ea38da482d5\": container with ID starting with 587e95e478255c5ab7978918eda8a5869d425a31c3fad8525cf07ea38da482d5 not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.827650 4886 scope.go:117] "RemoveContainer" containerID="fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.827697 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtk7r"] Jan 29 16:27:48 crc kubenswrapper[4886]: W0129 16:27:48.833424 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b8dc70_b29d_4995_9727_9b8e032bdad9.slice/crio-648bc592f49ae3cedaf90d37922cbc1e1495121ad8e957f81f4908846b5e05da WatchSource:0}: Error finding container 648bc592f49ae3cedaf90d37922cbc1e1495121ad8e957f81f4908846b5e05da: Status 404 returned error can't find the container with id 648bc592f49ae3cedaf90d37922cbc1e1495121ad8e957f81f4908846b5e05da Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.849642 4886 scope.go:117] "RemoveContainer" containerID="fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22" Jan 29 16:27:48 crc kubenswrapper[4886]: E0129 16:27:48.850050 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22\": container with ID starting with fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22 not found: ID does not exist" containerID="fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.850114 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22"} err="failed to get container status \"fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22\": rpc error: code = NotFound desc = could not find container 
\"fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22\": container with ID starting with fd7fef5ae316b90316f06b6e489cce7174661acd1d0b44078f269a28b56f1f22 not found: ID does not exist" Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.868472 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzc5s"] Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.873870 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzc5s"] Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.880834 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cj9vs"] Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.886556 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cj9vs"] Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.902496 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6hph6"] Jan 29 16:27:48 crc kubenswrapper[4886]: I0129 16:27:48.906210 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6hph6"] Jan 29 16:27:49 crc kubenswrapper[4886]: I0129 16:27:49.593862 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" event={"ID":"42b8dc70-b29d-4995-9727-9b8e032bdad9","Type":"ContainerStarted","Data":"f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e"} Jan 29 16:27:49 crc kubenswrapper[4886]: I0129 16:27:49.594172 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" event={"ID":"42b8dc70-b29d-4995-9727-9b8e032bdad9","Type":"ContainerStarted","Data":"648bc592f49ae3cedaf90d37922cbc1e1495121ad8e957f81f4908846b5e05da"} Jan 29 16:27:49 crc kubenswrapper[4886]: I0129 16:27:49.594193 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:49 crc kubenswrapper[4886]: I0129 16:27:49.599794 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:27:49 crc kubenswrapper[4886]: I0129 16:27:49.617147 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" podStartSLOduration=2.617014451 podStartE2EDuration="2.617014451s" podCreationTimestamp="2026-01-29 16:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:27:49.613156387 +0000 UTC m=+352.521875669" watchObservedRunningTime="2026-01-29 16:27:49.617014451 +0000 UTC m=+352.525733733" Jan 29 16:27:50 crc kubenswrapper[4886]: I0129 16:27:50.623012 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="434ccaea-8a30-4a97-8908-64bc9f550de0" path="/var/lib/kubelet/pods/434ccaea-8a30-4a97-8908-64bc9f550de0/volumes" Jan 29 16:27:50 crc kubenswrapper[4886]: I0129 16:27:50.623795 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c36e6697-37b9-4b10-baea-0f9c92014c79" path="/var/lib/kubelet/pods/c36e6697-37b9-4b10-baea-0f9c92014c79/volumes" Jan 29 16:27:50 crc kubenswrapper[4886]: I0129 16:27:50.624487 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d8a07d27-67fb-47e8-9032-e4f831983d75" path="/var/lib/kubelet/pods/d8a07d27-67fb-47e8-9032-e4f831983d75/volumes" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.262970 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jfv6k"] Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.263851 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047adc93-cb46-4ba7-bbdf-4d485a08ea6b" containerName="extract-utilities" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.263870 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="047adc93-cb46-4ba7-bbdf-4d485a08ea6b" containerName="extract-utilities" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.263884 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a07d27-67fb-47e8-9032-e4f831983d75" containerName="extract-content" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.263893 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a07d27-67fb-47e8-9032-e4f831983d75" containerName="extract-content" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.263902 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a07d27-67fb-47e8-9032-e4f831983d75" containerName="extract-utilities" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.263910 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a07d27-67fb-47e8-9032-e4f831983d75" containerName="extract-utilities" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.263922 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434ccaea-8a30-4a97-8908-64bc9f550de0" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.263929 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="434ccaea-8a30-4a97-8908-64bc9f550de0" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.263941 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a07d27-67fb-47e8-9032-e4f831983d75" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.263950 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a07d27-67fb-47e8-9032-e4f831983d75" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.263962 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047adc93-cb46-4ba7-bbdf-4d485a08ea6b" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.263969 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="047adc93-cb46-4ba7-bbdf-4d485a08ea6b" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.263981 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36e6697-37b9-4b10-baea-0f9c92014c79" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.263988 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36e6697-37b9-4b10-baea-0f9c92014c79" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.264002 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434ccaea-8a30-4a97-8908-64bc9f550de0" containerName="extract-utilities" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.264010 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="434ccaea-8a30-4a97-8908-64bc9f550de0" containerName="extract-utilities" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.264022 4886 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047adc93-cb46-4ba7-bbdf-4d485a08ea6b" containerName="extract-content" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.264030 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="047adc93-cb46-4ba7-bbdf-4d485a08ea6b" containerName="extract-content" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.264039 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36e6697-37b9-4b10-baea-0f9c92014c79" containerName="extract-content" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.264047 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36e6697-37b9-4b10-baea-0f9c92014c79" containerName="extract-content" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.264057 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36e6697-37b9-4b10-baea-0f9c92014c79" containerName="extract-utilities" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.264065 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36e6697-37b9-4b10-baea-0f9c92014c79" containerName="extract-utilities" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.264075 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17accc89-e860-4b12-b5b3-3da7adaa3430" containerName="marketplace-operator" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.264082 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="17accc89-e860-4b12-b5b3-3da7adaa3430" containerName="marketplace-operator" Jan 29 16:28:04 crc kubenswrapper[4886]: E0129 16:28:04.264097 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434ccaea-8a30-4a97-8908-64bc9f550de0" containerName="extract-content" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.264105 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="434ccaea-8a30-4a97-8908-64bc9f550de0" containerName="extract-content" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.264216 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a07d27-67fb-47e8-9032-e4f831983d75" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.264229 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="17accc89-e860-4b12-b5b3-3da7adaa3430" containerName="marketplace-operator" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.264241 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="047adc93-cb46-4ba7-bbdf-4d485a08ea6b" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.264266 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="434ccaea-8a30-4a97-8908-64bc9f550de0" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.264276 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c36e6697-37b9-4b10-baea-0f9c92014c79" containerName="registry-server" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.265245 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.318937 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.327618 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mlnk\" (UniqueName: \"kubernetes.io/projected/69003a39-1c09-4087-a494-ebfd69e973cf-kube-api-access-5mlnk\") pod \"certified-operators-jfv6k\" (UID: \"69003a39-1c09-4087-a494-ebfd69e973cf\") " pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.327779 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-utilities\") pod \"certified-operators-jfv6k\" (UID: \"69003a39-1c09-4087-a494-ebfd69e973cf\") " pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.327908 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-catalog-content\") pod \"certified-operators-jfv6k\" (UID: \"69003a39-1c09-4087-a494-ebfd69e973cf\") " pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.334027 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfv6k"] Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.428734 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-catalog-content\") pod \"certified-operators-jfv6k\" (UID: \"69003a39-1c09-4087-a494-ebfd69e973cf\") " pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.428776 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mlnk\" (UniqueName: \"kubernetes.io/projected/69003a39-1c09-4087-a494-ebfd69e973cf-kube-api-access-5mlnk\") pod \"certified-operators-jfv6k\" (UID: \"69003a39-1c09-4087-a494-ebfd69e973cf\") " pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.428823 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-utilities\") pod \"certified-operators-jfv6k\" (UID: \"69003a39-1c09-4087-a494-ebfd69e973cf\") " pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.429299 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-utilities\") pod \"certified-operators-jfv6k\" (UID: \"69003a39-1c09-4087-a494-ebfd69e973cf\") " pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.430209 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-catalog-content\") pod \"certified-operators-jfv6k\" (UID: 
\"69003a39-1c09-4087-a494-ebfd69e973cf\") " pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.448201 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mlnk\" (UniqueName: \"kubernetes.io/projected/69003a39-1c09-4087-a494-ebfd69e973cf-kube-api-access-5mlnk\") pod \"certified-operators-jfv6k\" (UID: \"69003a39-1c09-4087-a494-ebfd69e973cf\") " pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.648068 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.863845 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q5hs7"] Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.865495 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.872213 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.874635 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q5hs7"] Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.933894 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-utilities\") pod \"community-operators-q5hs7\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.933949 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-catalog-content\") pod \"community-operators-q5hs7\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:28:04 crc kubenswrapper[4886]: I0129 16:28:04.933969 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8jsj\" (UniqueName: \"kubernetes.io/projected/a7325ad0-28bf-45e0-bbd5-160f441de091-kube-api-access-c8jsj\") pod \"community-operators-q5hs7\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.035522 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-utilities\") pod \"community-operators-q5hs7\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.035670 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-catalog-content\") pod \"community-operators-q5hs7\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.035703 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c8jsj\" (UniqueName: \"kubernetes.io/projected/a7325ad0-28bf-45e0-bbd5-160f441de091-kube-api-access-c8jsj\") pod \"community-operators-q5hs7\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.035960 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-utilities\") pod \"community-operators-q5hs7\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.036176 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-catalog-content\") pod \"community-operators-q5hs7\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.054564 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8jsj\" (UniqueName: \"kubernetes.io/projected/a7325ad0-28bf-45e0-bbd5-160f441de091-kube-api-access-c8jsj\") pod \"community-operators-q5hs7\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.063868 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfv6k"] Jan 29 16:28:05 crc kubenswrapper[4886]: W0129 16:28:05.071063 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69003a39_1c09_4087_a494_ebfd69e973cf.slice/crio-e4d88167fe4815cd042b435714fee0326b8557c7e5fb2b46e9557a042ac995f8 WatchSource:0}: Error finding container e4d88167fe4815cd042b435714fee0326b8557c7e5fb2b46e9557a042ac995f8: Status 404 returned error can't find the container with id e4d88167fe4815cd042b435714fee0326b8557c7e5fb2b46e9557a042ac995f8 Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.191874 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.569104 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q5hs7"] Jan 29 16:28:05 crc kubenswrapper[4886]: W0129 16:28:05.578472 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7325ad0_28bf_45e0_bbd5_160f441de091.slice/crio-58e358a0eb4540bb049b243d60b0ba858eec19efdffef34538e1bbcdff0edbc6 WatchSource:0}: Error finding container 58e358a0eb4540bb049b243d60b0ba858eec19efdffef34538e1bbcdff0edbc6: Status 404 returned error can't find the container with id 58e358a0eb4540bb049b243d60b0ba858eec19efdffef34538e1bbcdff0edbc6 Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.684945 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5hs7" event={"ID":"a7325ad0-28bf-45e0-bbd5-160f441de091","Type":"ContainerStarted","Data":"58e358a0eb4540bb049b243d60b0ba858eec19efdffef34538e1bbcdff0edbc6"} Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.686977 4886 generic.go:334] "Generic (PLEG): container finished" podID="69003a39-1c09-4087-a494-ebfd69e973cf" containerID="9dc94c69454cda473e048b5be83a123e92e3d4dcc0206e5c91ebde5e727d2647" exitCode=0 Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.687039 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfv6k" event={"ID":"69003a39-1c09-4087-a494-ebfd69e973cf","Type":"ContainerDied","Data":"9dc94c69454cda473e048b5be83a123e92e3d4dcc0206e5c91ebde5e727d2647"} Jan 29 16:28:05 crc kubenswrapper[4886]: I0129 16:28:05.687078 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfv6k" event={"ID":"69003a39-1c09-4087-a494-ebfd69e973cf","Type":"ContainerStarted","Data":"e4d88167fe4815cd042b435714fee0326b8557c7e5fb2b46e9557a042ac995f8"} Jan 29 16:28:05 crc kubenswrapper[4886]: E0129 16:28:05.814257 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 16:28:05 crc kubenswrapper[4886]: E0129 16:28:05.814405 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mlnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jfv6k_openshift-marketplace(69003a39-1c09-4087-a494-ebfd69e973cf): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:05 crc kubenswrapper[4886]: E0129 16:28:05.815595 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.666680 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4qbl4"] Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.668606 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.672101 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qbl4"] Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.672286 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.704749 4886 generic.go:334] "Generic (PLEG): container finished" podID="a7325ad0-28bf-45e0-bbd5-160f441de091" containerID="bd8b45bdbc53c5a19f5d9b16c77f16088c5159f9cfac3b1dd35c0f4cdab8672d" exitCode=0 Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.704847 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5hs7" event={"ID":"a7325ad0-28bf-45e0-bbd5-160f441de091","Type":"ContainerDied","Data":"bd8b45bdbc53c5a19f5d9b16c77f16088c5159f9cfac3b1dd35c0f4cdab8672d"} Jan 29 16:28:06 crc kubenswrapper[4886]: E0129 16:28:06.706651 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:28:06 crc kubenswrapper[4886]: E0129 16:28:06.835958 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 16:28:06 crc kubenswrapper[4886]: E0129 16:28:06.836161 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8jsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-q5hs7_openshift-marketplace(a7325ad0-28bf-45e0-bbd5-160f441de091): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:06 crc kubenswrapper[4886]: E0129 16:28:06.837768 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.856721 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf7sq\" (UniqueName: \"kubernetes.io/projected/57aa9115-b2d5-45aa-8ac3-e251c0907e45-kube-api-access-vf7sq\") pod \"redhat-marketplace-4qbl4\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.856767 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-catalog-content\") pod \"redhat-marketplace-4qbl4\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.856812 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-utilities\") pod \"redhat-marketplace-4qbl4\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.958153 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-catalog-content\") pod \"redhat-marketplace-4qbl4\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.958248 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-utilities\") pod \"redhat-marketplace-4qbl4\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.958309 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf7sq\" (UniqueName: \"kubernetes.io/projected/57aa9115-b2d5-45aa-8ac3-e251c0907e45-kube-api-access-vf7sq\") pod \"redhat-marketplace-4qbl4\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.959254 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-catalog-content\") pod \"redhat-marketplace-4qbl4\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " 
pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.959538 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-utilities\") pod \"redhat-marketplace-4qbl4\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:28:06 crc kubenswrapper[4886]: I0129 16:28:06.981650 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf7sq\" (UniqueName: \"kubernetes.io/projected/57aa9115-b2d5-45aa-8ac3-e251c0907e45-kube-api-access-vf7sq\") pod \"redhat-marketplace-4qbl4\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.016508 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.263089 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zkk68"] Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.264434 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.266360 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.270861 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkk68"] Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.363530 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn92n\" (UniqueName: \"kubernetes.io/projected/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-kube-api-access-vn92n\") pod \"redhat-operators-zkk68\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.363616 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-catalog-content\") pod \"redhat-operators-zkk68\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.363672 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-utilities\") pod \"redhat-operators-zkk68\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.440309 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qbl4"] Jan 29 16:28:07 crc kubenswrapper[4886]: W0129 16:28:07.455044 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57aa9115_b2d5_45aa_8ac3_e251c0907e45.slice/crio-68d81ee76eccd615ba9046c4c1e6648df9ef22ce6eee6d566d9309dd619e6010 WatchSource:0}: Error finding container 68d81ee76eccd615ba9046c4c1e6648df9ef22ce6eee6d566d9309dd619e6010: 
Status 404 returned error can't find the container with id 68d81ee76eccd615ba9046c4c1e6648df9ef22ce6eee6d566d9309dd619e6010 Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.465130 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn92n\" (UniqueName: \"kubernetes.io/projected/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-kube-api-access-vn92n\") pod \"redhat-operators-zkk68\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.465194 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-catalog-content\") pod \"redhat-operators-zkk68\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.465225 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-utilities\") pod \"redhat-operators-zkk68\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.465693 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-catalog-content\") pod \"redhat-operators-zkk68\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.465748 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-utilities\") pod \"redhat-operators-zkk68\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.493415 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn92n\" (UniqueName: \"kubernetes.io/projected/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-kube-api-access-vn92n\") pod \"redhat-operators-zkk68\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.582203 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.713596 4886 generic.go:334] "Generic (PLEG): container finished" podID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" containerID="9483d17c90afb2d261251cb57ed87c956106b0b7bb964afcffdf0a2d1b5b13c1" exitCode=0 Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.713643 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qbl4" event={"ID":"57aa9115-b2d5-45aa-8ac3-e251c0907e45","Type":"ContainerDied","Data":"9483d17c90afb2d261251cb57ed87c956106b0b7bb964afcffdf0a2d1b5b13c1"} Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.713688 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qbl4" event={"ID":"57aa9115-b2d5-45aa-8ac3-e251c0907e45","Type":"ContainerStarted","Data":"68d81ee76eccd615ba9046c4c1e6648df9ef22ce6eee6d566d9309dd619e6010"} Jan 29 16:28:07 crc kubenswrapper[4886]: E0129 16:28:07.715284 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:28:07 crc kubenswrapper[4886]: E0129 16:28:07.843424 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 16:28:07 crc kubenswrapper[4886]: E0129 16:28:07.843630 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vf7sq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4qbl4_openshift-marketplace(57aa9115-b2d5-45aa-8ac3-e251c0907e45): ErrImagePull: initializing 
source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:07 crc kubenswrapper[4886]: E0129 16:28:07.844833 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:28:07 crc kubenswrapper[4886]: I0129 16:28:07.970624 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkk68"] Jan 29 16:28:07 crc kubenswrapper[4886]: W0129 16:28:07.988491 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd84ce3e9_c41a_4a08_8d86_2a918d5e9450.slice/crio-1de9e48715ad861e4d8bd78cecc12c2dcf52cdf92d4274338ddeebf931d7420d WatchSource:0}: Error finding container 1de9e48715ad861e4d8bd78cecc12c2dcf52cdf92d4274338ddeebf931d7420d: Status 404 returned error can't find the container with id 1de9e48715ad861e4d8bd78cecc12c2dcf52cdf92d4274338ddeebf931d7420d Jan 29 16:28:08 crc kubenswrapper[4886]: I0129 16:28:08.719473 4886 generic.go:334] "Generic (PLEG): container finished" podID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerID="9771013e1661afa4b7f2a5038c24d8397533ccd7c529146bb8fb2adf4c78bad6" exitCode=0 Jan 29 16:28:08 crc kubenswrapper[4886]: I0129 16:28:08.719536 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkk68" event={"ID":"d84ce3e9-c41a-4a08-8d86-2a918d5e9450","Type":"ContainerDied","Data":"9771013e1661afa4b7f2a5038c24d8397533ccd7c529146bb8fb2adf4c78bad6"} Jan 29 16:28:08 crc kubenswrapper[4886]: I0129 16:28:08.719586 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkk68" event={"ID":"d84ce3e9-c41a-4a08-8d86-2a918d5e9450","Type":"ContainerStarted","Data":"1de9e48715ad861e4d8bd78cecc12c2dcf52cdf92d4274338ddeebf931d7420d"} Jan 29 16:28:08 crc kubenswrapper[4886]: E0129 16:28:08.721137 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:28:08 crc kubenswrapper[4886]: E0129 16:28:08.849458 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 16:28:08 crc kubenswrapper[4886]: E0129 16:28:08.853851 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn92n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zkk68_openshift-marketplace(d84ce3e9-c41a-4a08-8d86-2a918d5e9450): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:08 crc kubenswrapper[4886]: E0129 16:28:08.855644 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:28:09 crc kubenswrapper[4886]: E0129 16:28:09.726299 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:28:11 crc kubenswrapper[4886]: I0129 16:28:11.648433 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-559577448b-qljqw"] Jan 29 16:28:11 crc kubenswrapper[4886]: I0129 16:28:11.649022 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-559577448b-qljqw" podUID="e7b68f8a-9483-479e-bf2d-441dff994e02" containerName="controller-manager" containerID="cri-o://1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb" gracePeriod=30 Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.142802 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.223957 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-client-ca\") pod \"e7b68f8a-9483-479e-bf2d-441dff994e02\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.224002 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbqww\" (UniqueName: \"kubernetes.io/projected/e7b68f8a-9483-479e-bf2d-441dff994e02-kube-api-access-sbqww\") pod \"e7b68f8a-9483-479e-bf2d-441dff994e02\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.224026 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-config\") pod \"e7b68f8a-9483-479e-bf2d-441dff994e02\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.225362 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-client-ca" (OuterVolumeSpecName: "client-ca") pod "e7b68f8a-9483-479e-bf2d-441dff994e02" (UID: "e7b68f8a-9483-479e-bf2d-441dff994e02"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.225426 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-config" (OuterVolumeSpecName: "config") pod "e7b68f8a-9483-479e-bf2d-441dff994e02" (UID: "e7b68f8a-9483-479e-bf2d-441dff994e02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.230238 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b68f8a-9483-479e-bf2d-441dff994e02-kube-api-access-sbqww" (OuterVolumeSpecName: "kube-api-access-sbqww") pod "e7b68f8a-9483-479e-bf2d-441dff994e02" (UID: "e7b68f8a-9483-479e-bf2d-441dff994e02"). InnerVolumeSpecName "kube-api-access-sbqww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.324759 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-proxy-ca-bundles\") pod \"e7b68f8a-9483-479e-bf2d-441dff994e02\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.324934 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7b68f8a-9483-479e-bf2d-441dff994e02-serving-cert\") pod \"e7b68f8a-9483-479e-bf2d-441dff994e02\" (UID: \"e7b68f8a-9483-479e-bf2d-441dff994e02\") " Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.325424 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbqww\" (UniqueName: \"kubernetes.io/projected/e7b68f8a-9483-479e-bf2d-441dff994e02-kube-api-access-sbqww\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.325422 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e7b68f8a-9483-479e-bf2d-441dff994e02" (UID: "e7b68f8a-9483-479e-bf2d-441dff994e02"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.325483 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.325508 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-config\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.329220 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b68f8a-9483-479e-bf2d-441dff994e02-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7b68f8a-9483-479e-bf2d-441dff994e02" (UID: "e7b68f8a-9483-479e-bf2d-441dff994e02"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.425965 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7b68f8a-9483-479e-bf2d-441dff994e02-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.426015 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7b68f8a-9483-479e-bf2d-441dff994e02-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.738898 4886 generic.go:334] "Generic (PLEG): container finished" podID="e7b68f8a-9483-479e-bf2d-441dff994e02" containerID="1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb" exitCode=0 Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.738941 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-559577448b-qljqw" event={"ID":"e7b68f8a-9483-479e-bf2d-441dff994e02","Type":"ContainerDied","Data":"1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb"} Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.738972 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-559577448b-qljqw" event={"ID":"e7b68f8a-9483-479e-bf2d-441dff994e02","Type":"ContainerDied","Data":"61013901f79515c510fd797b6e9c94166fd6b2d802a9282570c4f90aaedd5f07"} Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.738991 4886 scope.go:117] "RemoveContainer" containerID="1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.739104 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-559577448b-qljqw" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.755910 4886 scope.go:117] "RemoveContainer" containerID="1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb" Jan 29 16:28:12 crc kubenswrapper[4886]: E0129 16:28:12.756519 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb\": container with ID starting with 1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb not found: ID does not exist" containerID="1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.756547 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb"} err="failed to get container status \"1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb\": rpc error: code = NotFound desc = could not find container \"1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb\": container with ID starting with 1baf76b04c25852c14f6eddaeefa7479b2d32f63cecc26a393263dba5b8aedfb not found: ID does not exist" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.762938 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-559577448b-qljqw"] Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.771349 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-559577448b-qljqw"] Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.882035 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c58fc677-rq8vv"] Jan 29 16:28:12 crc kubenswrapper[4886]: E0129 16:28:12.882505 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b68f8a-9483-479e-bf2d-441dff994e02" containerName="controller-manager" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.882518 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b68f8a-9483-479e-bf2d-441dff994e02" containerName="controller-manager" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.882611 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b68f8a-9483-479e-bf2d-441dff994e02" containerName="controller-manager" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.882942 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.884462 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.885213 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.885236 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.885290 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.886722 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.886811 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.892878 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c58fc677-rq8vv"] Jan 29 16:28:12 crc kubenswrapper[4886]: I0129 16:28:12.898120 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.033439 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2j4h\" (UniqueName: \"kubernetes.io/projected/f13f8975-f61d-4cf6-8a08-76e4427efada-kube-api-access-j2j4h\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.033578 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f13f8975-f61d-4cf6-8a08-76e4427efada-client-ca\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.033628 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f13f8975-f61d-4cf6-8a08-76e4427efada-config\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.033663 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f13f8975-f61d-4cf6-8a08-76e4427efada-proxy-ca-bundles\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.033842 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f13f8975-f61d-4cf6-8a08-76e4427efada-serving-cert\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.135193 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f13f8975-f61d-4cf6-8a08-76e4427efada-serving-cert\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.135263 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2j4h\" (UniqueName: \"kubernetes.io/projected/f13f8975-f61d-4cf6-8a08-76e4427efada-kube-api-access-j2j4h\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.135292 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f13f8975-f61d-4cf6-8a08-76e4427efada-client-ca\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.135339 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f13f8975-f61d-4cf6-8a08-76e4427efada-config\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.135361 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f13f8975-f61d-4cf6-8a08-76e4427efada-proxy-ca-bundles\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.136492 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f13f8975-f61d-4cf6-8a08-76e4427efada-proxy-ca-bundles\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.136882 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f13f8975-f61d-4cf6-8a08-76e4427efada-client-ca\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.136989 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f13f8975-f61d-4cf6-8a08-76e4427efada-config\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc 
kubenswrapper[4886]: I0129 16:28:13.138478 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f13f8975-f61d-4cf6-8a08-76e4427efada-serving-cert\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.151362 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2j4h\" (UniqueName: \"kubernetes.io/projected/f13f8975-f61d-4cf6-8a08-76e4427efada-kube-api-access-j2j4h\") pod \"controller-manager-c58fc677-rq8vv\" (UID: \"f13f8975-f61d-4cf6-8a08-76e4427efada\") " pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.200964 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.371880 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c58fc677-rq8vv"] Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.746013 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" event={"ID":"f13f8975-f61d-4cf6-8a08-76e4427efada","Type":"ContainerStarted","Data":"a07266500e9f0b537705d2ac1e2e398e522bbc0519fdd50045f683924f5f7c8a"} Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.746299 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" event={"ID":"f13f8975-f61d-4cf6-8a08-76e4427efada","Type":"ContainerStarted","Data":"46802b4bcfebe0b5f4e58a06ce68253a5640e5675bb511d193c95ea139fd61d0"} Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.746733 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.757730 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" Jan 29 16:28:13 crc kubenswrapper[4886]: I0129 16:28:13.798940 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c58fc677-rq8vv" podStartSLOduration=2.798921026 podStartE2EDuration="2.798921026s" podCreationTimestamp="2026-01-29 16:28:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:28:13.783683193 +0000 UTC m=+376.692402465" watchObservedRunningTime="2026-01-29 16:28:13.798921026 +0000 UTC m=+376.707640298" Jan 29 16:28:14 crc kubenswrapper[4886]: I0129 16:28:14.621138 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b68f8a-9483-479e-bf2d-441dff994e02" path="/var/lib/kubelet/pods/e7b68f8a-9483-479e-bf2d-441dff994e02/volumes" Jan 29 16:28:18 crc kubenswrapper[4886]: I0129 16:28:18.511929 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg"] Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.033195 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm"] Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 
16:28:19.034792 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.041835 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm"] Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.043883 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.044842 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.045873 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.045988 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.045988 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.112109 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3dd4249-1e33-4000-8cf8-94db106891dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-9stfm\" (UID: \"a3dd4249-1e33-4000-8cf8-94db106891dc\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.112164 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a3dd4249-1e33-4000-8cf8-94db106891dc-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-9stfm\" (UID: \"a3dd4249-1e33-4000-8cf8-94db106891dc\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.112195 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrd6t\" (UniqueName: \"kubernetes.io/projected/a3dd4249-1e33-4000-8cf8-94db106891dc-kube-api-access-zrd6t\") pod \"cluster-monitoring-operator-6d5b84845-9stfm\" (UID: \"a3dd4249-1e33-4000-8cf8-94db106891dc\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.213033 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3dd4249-1e33-4000-8cf8-94db106891dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-9stfm\" (UID: \"a3dd4249-1e33-4000-8cf8-94db106891dc\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.213080 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a3dd4249-1e33-4000-8cf8-94db106891dc-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-9stfm\" (UID: \"a3dd4249-1e33-4000-8cf8-94db106891dc\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.213107 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrd6t\" (UniqueName: \"kubernetes.io/projected/a3dd4249-1e33-4000-8cf8-94db106891dc-kube-api-access-zrd6t\") pod \"cluster-monitoring-operator-6d5b84845-9stfm\" (UID: \"a3dd4249-1e33-4000-8cf8-94db106891dc\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.214114 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a3dd4249-1e33-4000-8cf8-94db106891dc-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-9stfm\" (UID: \"a3dd4249-1e33-4000-8cf8-94db106891dc\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.223046 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3dd4249-1e33-4000-8cf8-94db106891dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-9stfm\" (UID: \"a3dd4249-1e33-4000-8cf8-94db106891dc\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.241009 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrd6t\" (UniqueName: \"kubernetes.io/projected/a3dd4249-1e33-4000-8cf8-94db106891dc-kube-api-access-zrd6t\") pod \"cluster-monitoring-operator-6d5b84845-9stfm\" (UID: \"a3dd4249-1e33-4000-8cf8-94db106891dc\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.394425 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" Jan 29 16:28:19 crc kubenswrapper[4886]: E0129 16:28:19.743562 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 16:28:19 crc kubenswrapper[4886]: E0129 16:28:19.743996 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mlnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jfv6k_openshift-marketplace(69003a39-1c09-4087-a494-ebfd69e973cf): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:19 crc kubenswrapper[4886]: E0129 16:28:19.745108 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:28:19 crc kubenswrapper[4886]: I0129 16:28:19.795137 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm"] Jan 29 16:28:19 crc kubenswrapper[4886]: W0129 16:28:19.804748 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3dd4249_1e33_4000_8cf8_94db106891dc.slice/crio-48b8b1abf01ec742aef11fe0a5a8be90c6afc747ffbf4755afea7e1865e560d3 WatchSource:0}: Error finding container 48b8b1abf01ec742aef11fe0a5a8be90c6afc747ffbf4755afea7e1865e560d3: Status 404 returned error can't 
find the container with id 48b8b1abf01ec742aef11fe0a5a8be90c6afc747ffbf4755afea7e1865e560d3 Jan 29 16:28:20 crc kubenswrapper[4886]: I0129 16:28:20.783195 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" event={"ID":"a3dd4249-1e33-4000-8cf8-94db106891dc","Type":"ContainerStarted","Data":"48b8b1abf01ec742aef11fe0a5a8be90c6afc747ffbf4755afea7e1865e560d3"} Jan 29 16:28:21 crc kubenswrapper[4886]: E0129 16:28:21.752395 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 16:28:21 crc kubenswrapper[4886]: E0129 16:28:21.752919 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8jsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-q5hs7_openshift-marketplace(a7325ad0-28bf-45e0-bbd5-160f441de091): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:21 crc kubenswrapper[4886]: E0129 16:28:21.754377 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:28:21 crc kubenswrapper[4886]: E0129 16:28:21.765120 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 
403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 16:28:21 crc kubenswrapper[4886]: E0129 16:28:21.765316 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn92n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zkk68_openshift-marketplace(d84ce3e9-c41a-4a08-8d86-2a918d5e9450): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:21 crc kubenswrapper[4886]: E0129 16:28:21.766888 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:28:21 crc kubenswrapper[4886]: I0129 16:28:21.789155 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" event={"ID":"a3dd4249-1e33-4000-8cf8-94db106891dc","Type":"ContainerStarted","Data":"40462d71fb2894f86ea4404c51bffe0c125791dad00e65d87f460a224575d876"} Jan 29 16:28:21 crc kubenswrapper[4886]: I0129 16:28:21.810630 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9stfm" podStartSLOduration=1.038990985 podStartE2EDuration="2.810608639s" podCreationTimestamp="2026-01-29 16:28:19 +0000 UTC" firstStartedPulling="2026-01-29 16:28:19.80785382 +0000 UTC m=+382.716573092" lastFinishedPulling="2026-01-29 16:28:21.579471474 +0000 UTC m=+384.488190746" observedRunningTime="2026-01-29 16:28:21.804020943 +0000 UTC m=+384.712740225" watchObservedRunningTime="2026-01-29 16:28:21.810608639 +0000 UTC m=+384.719327911" Jan 29 16:28:22 crc kubenswrapper[4886]: I0129 16:28:22.170271 
4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d"] Jan 29 16:28:22 crc kubenswrapper[4886]: I0129 16:28:22.171080 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d" Jan 29 16:28:22 crc kubenswrapper[4886]: I0129 16:28:22.172913 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Jan 29 16:28:22 crc kubenswrapper[4886]: I0129 16:28:22.172985 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-7bjfn" Jan 29 16:28:22 crc kubenswrapper[4886]: I0129 16:28:22.179056 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d"] Jan 29 16:28:22 crc kubenswrapper[4886]: I0129 16:28:22.251263 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c050db1b-3854-406d-8cc5-fc997e9a1abe-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfl4d\" (UID: \"c050db1b-3854-406d-8cc5-fc997e9a1abe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d" Jan 29 16:28:22 crc kubenswrapper[4886]: I0129 16:28:22.352758 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c050db1b-3854-406d-8cc5-fc997e9a1abe-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfl4d\" (UID: \"c050db1b-3854-406d-8cc5-fc997e9a1abe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d" Jan 29 16:28:22 crc kubenswrapper[4886]: I0129 16:28:22.361479 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c050db1b-3854-406d-8cc5-fc997e9a1abe-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfl4d\" (UID: \"c050db1b-3854-406d-8cc5-fc997e9a1abe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d" Jan 29 16:28:22 crc kubenswrapper[4886]: I0129 16:28:22.485674 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d" Jan 29 16:28:22 crc kubenswrapper[4886]: E0129 16:28:22.789977 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 16:28:22 crc kubenswrapper[4886]: E0129 16:28:22.790737 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vf7sq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4qbl4_openshift-marketplace(57aa9115-b2d5-45aa-8ac3-e251c0907e45): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:22 crc kubenswrapper[4886]: E0129 16:28:22.791914 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:28:22 crc kubenswrapper[4886]: I0129 16:28:22.896986 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d"] Jan 29 16:28:23 crc kubenswrapper[4886]: I0129 16:28:23.806055 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d" event={"ID":"c050db1b-3854-406d-8cc5-fc997e9a1abe","Type":"ContainerStarted","Data":"8455383e60539f36cbaa22b285d5315c40a7997e6100f72dc2fa08d6ee382658"} Jan 29 16:28:24 crc kubenswrapper[4886]: I0129 16:28:24.812198 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d" event={"ID":"c050db1b-3854-406d-8cc5-fc997e9a1abe","Type":"ContainerStarted","Data":"252860898c2683bc1c12338582f088908b61f2309f0b69654c33a383f7fa819b"} Jan 29 16:28:24 crc kubenswrapper[4886]: I0129 16:28:24.812584 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d" Jan 29 16:28:24 crc kubenswrapper[4886]: I0129 16:28:24.817384 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d" Jan 29 16:28:24 crc kubenswrapper[4886]: I0129 16:28:24.827993 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfl4d" podStartSLOduration=1.136545306 podStartE2EDuration="2.827978405s" podCreationTimestamp="2026-01-29 16:28:22 +0000 UTC" firstStartedPulling="2026-01-29 16:28:22.906964178 +0000 UTC m=+385.815683460" lastFinishedPulling="2026-01-29 16:28:24.598397287 +0000 UTC m=+387.507116559" observedRunningTime="2026-01-29 16:28:24.827837561 +0000 UTC m=+387.736556833" watchObservedRunningTime="2026-01-29 16:28:24.827978405 +0000 UTC m=+387.736697677" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.241833 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-g77js"] Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.242636 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.244035 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-bpjmt" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.245204 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.245249 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.245518 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.259276 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-g77js"] Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.401877 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/672614ef-138a-405e-a615-b56724368e8f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.402034 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvt8\" (UniqueName: \"kubernetes.io/projected/672614ef-138a-405e-a615-b56724368e8f-kube-api-access-5lvt8\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " 
pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.402097 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/672614ef-138a-405e-a615-b56724368e8f-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.402147 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/672614ef-138a-405e-a615-b56724368e8f-metrics-client-ca\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.503471 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/672614ef-138a-405e-a615-b56724368e8f-metrics-client-ca\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.503654 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/672614ef-138a-405e-a615-b56724368e8f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.503771 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvt8\" (UniqueName: \"kubernetes.io/projected/672614ef-138a-405e-a615-b56724368e8f-kube-api-access-5lvt8\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.503848 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/672614ef-138a-405e-a615-b56724368e8f-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.505282 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/672614ef-138a-405e-a615-b56724368e8f-metrics-client-ca\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.511734 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/672614ef-138a-405e-a615-b56724368e8f-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc 
kubenswrapper[4886]: I0129 16:28:25.515441 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/672614ef-138a-405e-a615-b56724368e8f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.527132 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvt8\" (UniqueName: \"kubernetes.io/projected/672614ef-138a-405e-a615-b56724368e8f-kube-api-access-5lvt8\") pod \"prometheus-operator-db54df47d-g77js\" (UID: \"672614ef-138a-405e-a615-b56724368e8f\") " pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.557380 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" Jan 29 16:28:25 crc kubenswrapper[4886]: I0129 16:28:25.964872 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-g77js"] Jan 29 16:28:26 crc kubenswrapper[4886]: I0129 16:28:26.822712 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" event={"ID":"672614ef-138a-405e-a615-b56724368e8f","Type":"ContainerStarted","Data":"0ee607a1132785eb0b57178d86676b176f64d1ebd9cc2429a153eeeac5628f4e"} Jan 29 16:28:27 crc kubenswrapper[4886]: I0129 16:28:27.830123 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" event={"ID":"672614ef-138a-405e-a615-b56724368e8f","Type":"ContainerStarted","Data":"1a669f284167fd42f2cf77fd3bad2013a7bf2323b79ec9e9b09f83ee67918217"} Jan 29 16:28:28 crc kubenswrapper[4886]: I0129 16:28:28.836513 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" event={"ID":"672614ef-138a-405e-a615-b56724368e8f","Type":"ContainerStarted","Data":"3bb35df2f59865df8f660eda08260051716912f6ac3e8c1839863b657a15182b"} Jan 29 16:28:28 crc kubenswrapper[4886]: I0129 16:28:28.862670 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-g77js" podStartSLOduration=2.244039582 podStartE2EDuration="3.86264173s" podCreationTimestamp="2026-01-29 16:28:25 +0000 UTC" firstStartedPulling="2026-01-29 16:28:25.972795466 +0000 UTC m=+388.881514738" lastFinishedPulling="2026-01-29 16:28:27.591397604 +0000 UTC m=+390.500116886" observedRunningTime="2026-01-29 16:28:28.858379101 +0000 UTC m=+391.767098383" watchObservedRunningTime="2026-01-29 16:28:28.86264173 +0000 UTC m=+391.771361062" Jan 29 16:28:29 crc kubenswrapper[4886]: I0129 16:28:29.661140 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:28:29 crc kubenswrapper[4886]: I0129 16:28:29.661227 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.551864 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-w4847"] Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.554270 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.566531 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x"] Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.566828 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.567066 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-dxqr2" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.567557 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.568815 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-w4847"] Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.570215 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.570659 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.570967 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.572059 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-v7f66" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.580908 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.585048 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x"] Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.603541 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tsz6m"] Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.604642 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.610651 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.610819 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.610928 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-5rgqh" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.674318 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qktqq\" (UniqueName: \"kubernetes.io/projected/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-api-access-qktqq\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.674588 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a5d78538-806d-458c-ae3c-4ac03596fe18-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.674689 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.674773 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8fc5b733-9271-4576-b06b-f6bece792d8a-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.674847 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5d78538-806d-458c-ae3c-4ac03596fe18-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.674941 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.675057 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/8fc5b733-9271-4576-b06b-f6bece792d8a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.675143 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fc5b733-9271-4576-b06b-f6bece792d8a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.675223 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.675317 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4mw4\" (UniqueName: \"kubernetes.io/projected/8fc5b733-9271-4576-b06b-f6bece792d8a-kube-api-access-d4mw4\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.776883 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.777144 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-textfile\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.777293 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86cef950-d7b4-468c-bb9f-e71a98ffe676-metrics-client-ca\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.777449 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8fc5b733-9271-4576-b06b-f6bece792d8a-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.777610 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5d78538-806d-458c-ae3c-4ac03596fe18-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.777752 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-wtmp\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.777867 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.777973 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4c9\" (UniqueName: \"kubernetes.io/projected/86cef950-d7b4-468c-bb9f-e71a98ffe676-kube-api-access-vd4c9\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.778078 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.778163 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86cef950-d7b4-468c-bb9f-e71a98ffe676-sys\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.778248 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8fc5b733-9271-4576-b06b-f6bece792d8a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.778318 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/86cef950-d7b4-468c-bb9f-e71a98ffe676-root\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.778457 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-tls\") pod 
\"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.778562 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fc5b733-9271-4576-b06b-f6bece792d8a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.778652 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.778754 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4mw4\" (UniqueName: \"kubernetes.io/projected/8fc5b733-9271-4576-b06b-f6bece792d8a-kube-api-access-d4mw4\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.778915 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qktqq\" (UniqueName: \"kubernetes.io/projected/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-api-access-qktqq\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.779028 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a5d78538-806d-458c-ae3c-4ac03596fe18-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.779049 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5d78538-806d-458c-ae3c-4ac03596fe18-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.778379 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8fc5b733-9271-4576-b06b-f6bece792d8a-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.779569 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a5d78538-806d-458c-ae3c-4ac03596fe18-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " 
pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.779753 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.783764 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fc5b733-9271-4576-b06b-f6bece792d8a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.783921 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8fc5b733-9271-4576-b06b-f6bece792d8a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.784564 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.784568 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.797978 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4mw4\" (UniqueName: \"kubernetes.io/projected/8fc5b733-9271-4576-b06b-f6bece792d8a-kube-api-access-d4mw4\") pod \"openshift-state-metrics-566fddb674-w4847\" (UID: \"8fc5b733-9271-4576-b06b-f6bece792d8a\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.799504 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qktqq\" (UniqueName: \"kubernetes.io/projected/a5d78538-806d-458c-ae3c-4ac03596fe18-kube-api-access-qktqq\") pod \"kube-state-metrics-777cb5bd5d-28t5x\" (UID: \"a5d78538-806d-458c-ae3c-4ac03596fe18\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.880516 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.880887 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-textfile\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.880925 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86cef950-d7b4-468c-bb9f-e71a98ffe676-metrics-client-ca\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.880950 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-wtmp\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.880967 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4c9\" (UniqueName: \"kubernetes.io/projected/86cef950-d7b4-468c-bb9f-e71a98ffe676-kube-api-access-vd4c9\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.880987 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86cef950-d7b4-468c-bb9f-e71a98ffe676-sys\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.881003 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.881022 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/86cef950-d7b4-468c-bb9f-e71a98ffe676-root\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.881042 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-tls\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.881341 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/86cef950-d7b4-468c-bb9f-e71a98ffe676-root\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 
29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.881377 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86cef950-d7b4-468c-bb9f-e71a98ffe676-sys\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.881508 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-wtmp\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.882122 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-textfile\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.882462 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86cef950-d7b4-468c-bb9f-e71a98ffe676-metrics-client-ca\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.885969 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-tls\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.886783 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86cef950-d7b4-468c-bb9f-e71a98ffe676-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.895766 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.907298 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4c9\" (UniqueName: \"kubernetes.io/projected/86cef950-d7b4-468c-bb9f-e71a98ffe676-kube-api-access-vd4c9\") pod \"node-exporter-tsz6m\" (UID: \"86cef950-d7b4-468c-bb9f-e71a98ffe676\") " pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: I0129 16:28:30.935168 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tsz6m" Jan 29 16:28:30 crc kubenswrapper[4886]: W0129 16:28:30.959781 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86cef950_d7b4_468c_bb9f_e71a98ffe676.slice/crio-f2ccfa8ffd6d77641522959a801d9126e80a9c79315d6bf26f7ce89ec7e4b511 WatchSource:0}: Error finding container f2ccfa8ffd6d77641522959a801d9126e80a9c79315d6bf26f7ce89ec7e4b511: Status 404 returned error can't find the container with id f2ccfa8ffd6d77641522959a801d9126e80a9c79315d6bf26f7ce89ec7e4b511 Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.286186 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-w4847"] Jan 29 16:28:31 crc kubenswrapper[4886]: W0129 16:28:31.292444 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fc5b733_9271_4576_b06b_f6bece792d8a.slice/crio-35335d098d1e6004276f92ee90f008ab46cdd56e260d8b8c5af8ae31745dec40 WatchSource:0}: Error finding container 35335d098d1e6004276f92ee90f008ab46cdd56e260d8b8c5af8ae31745dec40: Status 404 returned error can't find the container with id 35335d098d1e6004276f92ee90f008ab46cdd56e260d8b8c5af8ae31745dec40 Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.365721 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x"] Jan 29 16:28:31 crc kubenswrapper[4886]: W0129 16:28:31.369255 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5d78538_806d_458c_ae3c_4ac03596fe18.slice/crio-037e2449384b22b4b812bba703eed3b9414e27a7f858d877a1204e0f2a303e0b WatchSource:0}: Error finding container 037e2449384b22b4b812bba703eed3b9414e27a7f858d877a1204e0f2a303e0b: Status 404 returned error can't find the container with id 037e2449384b22b4b812bba703eed3b9414e27a7f858d877a1204e0f2a303e0b Jan 29 16:28:31 crc kubenswrapper[4886]: E0129 16:28:31.617433 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.665162 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.667401 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.680596 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.680636 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.680674 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-chnnp" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.680684 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.683538 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.683629 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.684361 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.684561 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.690343 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.690625 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.690651 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-config-out\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.690685 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.690705 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.690786 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.690860 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-config-volume\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.690902 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.690926 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h4ql\" (UniqueName: \"kubernetes.io/projected/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-kube-api-access-7h4ql\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.691164 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-web-config\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.691191 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.691210 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.691230 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.691254 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.791988 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.792072 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-config-out\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.792109 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.792140 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.792164 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.792188 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-config-volume\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.792212 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.792236 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h4ql\" (UniqueName: \"kubernetes.io/projected/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-kube-api-access-7h4ql\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.792306 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-web-config\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.792353 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.792379 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.792412 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.796813 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.797225 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.798159 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.798162 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-config-out\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.798272 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.799386 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.799750 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-web-config\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.802078 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.802607 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-config-volume\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.803768 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.803982 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.816699 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h4ql\" (UniqueName: \"kubernetes.io/projected/43bcb21d-ccb0-474a-8a4b-20c4fd56904a-kube-api-access-7h4ql\") pod \"alertmanager-main-0\" (UID: \"43bcb21d-ccb0-474a-8a4b-20c4fd56904a\") " pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.857000 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tsz6m" event={"ID":"86cef950-d7b4-468c-bb9f-e71a98ffe676","Type":"ContainerStarted","Data":"f2ccfa8ffd6d77641522959a801d9126e80a9c79315d6bf26f7ce89ec7e4b511"} Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.860550 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" event={"ID":"8fc5b733-9271-4576-b06b-f6bece792d8a","Type":"ContainerStarted","Data":"c6c007ce7d14dad969f24874130472966dedd8ff15d70b4ce278565fb9dacdc4"} Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.860604 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" event={"ID":"8fc5b733-9271-4576-b06b-f6bece792d8a","Type":"ContainerStarted","Data":"e6e7efec676ad7f430a5071703c136559743f22c02db645476e614c951695f5d"} Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.860618 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" 
event={"ID":"8fc5b733-9271-4576-b06b-f6bece792d8a","Type":"ContainerStarted","Data":"35335d098d1e6004276f92ee90f008ab46cdd56e260d8b8c5af8ae31745dec40"} Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.861597 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" event={"ID":"a5d78538-806d-458c-ae3c-4ac03596fe18","Type":"ContainerStarted","Data":"037e2449384b22b4b812bba703eed3b9414e27a7f858d877a1204e0f2a303e0b"} Jan 29 16:28:31 crc kubenswrapper[4886]: I0129 16:28:31.984115 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.739880 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-645496d5c-x86sq"] Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.741800 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-645496d5c-x86sq"] Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.741884 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.748835 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-7l1p2e2gr0th4" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.748857 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.749029 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.749402 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.749523 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.749754 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.749770 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-p77fn" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.798177 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Jan 29 16:28:32 crc kubenswrapper[4886]: E0129 16:28:32.907645 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.930454 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-grpc-tls\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:32 crc 
kubenswrapper[4886]: I0129 16:28:32.930614 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-metrics-client-ca\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.930679 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.931927 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8pfg\" (UniqueName: \"kubernetes.io/projected/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-kube-api-access-c8pfg\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.932026 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-tls\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.932136 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.932166 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:32 crc kubenswrapper[4886]: I0129 16:28:32.932226 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.033710 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-tls\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " 
pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.033799 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.033846 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.033883 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.033962 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-grpc-tls\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.034174 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-metrics-client-ca\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.034787 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.034816 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8pfg\" (UniqueName: \"kubernetes.io/projected/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-kube-api-access-c8pfg\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.035054 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-metrics-client-ca\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc 
kubenswrapper[4886]: I0129 16:28:33.039312 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-grpc-tls\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.039502 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.039561 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.040145 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.040474 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-tls\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.048983 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.063836 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8pfg\" (UniqueName: \"kubernetes.io/projected/cf13b56e-deb1-4a2d-8d41-139db9eb5dbe-kube-api-access-c8pfg\") pod \"thanos-querier-645496d5c-x86sq\" (UID: \"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe\") " pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.066703 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:33 crc kubenswrapper[4886]: E0129 16:28:33.616099 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:28:33 crc kubenswrapper[4886]: E0129 16:28:33.616870 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.744681 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-645496d5c-x86sq"] Jan 29 16:28:33 crc kubenswrapper[4886]: W0129 16:28:33.762768 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf13b56e_deb1_4a2d_8d41_139db9eb5dbe.slice/crio-277767fe6c1e80f59467b3a2a2d85eb485fd3daaee3c03fe69cf330fdf2d3f9e WatchSource:0}: Error finding container 277767fe6c1e80f59467b3a2a2d85eb485fd3daaee3c03fe69cf330fdf2d3f9e: Status 404 returned error can't find the container with id 277767fe6c1e80f59467b3a2a2d85eb485fd3daaee3c03fe69cf330fdf2d3f9e Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.875344 4886 generic.go:334] "Generic (PLEG): container finished" podID="86cef950-d7b4-468c-bb9f-e71a98ffe676" containerID="35c0a7c8171d777037ab6ba9c183894a9159683145da271a51441c28f2fd717b" exitCode=0 Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.875410 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tsz6m" event={"ID":"86cef950-d7b4-468c-bb9f-e71a98ffe676","Type":"ContainerDied","Data":"35c0a7c8171d777037ab6ba9c183894a9159683145da271a51441c28f2fd717b"} Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.881554 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43bcb21d-ccb0-474a-8a4b-20c4fd56904a","Type":"ContainerStarted","Data":"c4f9c342c319c4a8afe12a37991cc1ceb0e97ac29ad4f77ada690f6b56230195"} Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.895269 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" event={"ID":"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe","Type":"ContainerStarted","Data":"277767fe6c1e80f59467b3a2a2d85eb485fd3daaee3c03fe69cf330fdf2d3f9e"} Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.898990 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" event={"ID":"8fc5b733-9271-4576-b06b-f6bece792d8a","Type":"ContainerStarted","Data":"aef040dc6b9566809fec01427ce7669ca9c5a316a48d7858f1b99d2e2d5aeac1"} Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.902055 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" event={"ID":"a5d78538-806d-458c-ae3c-4ac03596fe18","Type":"ContainerStarted","Data":"0c853deda8ba1f4aae5c528542ebfb9161a926cce83619aeaaff27f2ffc6e02b"} Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.902088 4886 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" event={"ID":"a5d78538-806d-458c-ae3c-4ac03596fe18","Type":"ContainerStarted","Data":"2fd57ba88d837f53e8ff09ec3605ec748ad76981bc58792c786485e48a3c66f4"} Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.902102 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" event={"ID":"a5d78538-806d-458c-ae3c-4ac03596fe18","Type":"ContainerStarted","Data":"394c70b2ba5552c56c03afe6e4fd4ee92b9edc2b2bd22a48af44e7f66c6b7115"} Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.917092 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w4847" podStartSLOduration=2.120780124 podStartE2EDuration="3.917061893s" podCreationTimestamp="2026-01-29 16:28:30 +0000 UTC" firstStartedPulling="2026-01-29 16:28:31.630084258 +0000 UTC m=+394.538803530" lastFinishedPulling="2026-01-29 16:28:33.426366027 +0000 UTC m=+396.335085299" observedRunningTime="2026-01-29 16:28:33.914396709 +0000 UTC m=+396.823116001" watchObservedRunningTime="2026-01-29 16:28:33.917061893 +0000 UTC m=+396.825781165" Jan 29 16:28:33 crc kubenswrapper[4886]: I0129 16:28:33.940156 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-28t5x" podStartSLOduration=1.996917189 podStartE2EDuration="3.940116444s" podCreationTimestamp="2026-01-29 16:28:30 +0000 UTC" firstStartedPulling="2026-01-29 16:28:31.371770324 +0000 UTC m=+394.280489596" lastFinishedPulling="2026-01-29 16:28:33.314969579 +0000 UTC m=+396.223688851" observedRunningTime="2026-01-29 16:28:33.935307501 +0000 UTC m=+396.844026793" watchObservedRunningTime="2026-01-29 16:28:33.940116444 +0000 UTC m=+396.848835736" Jan 29 16:28:34 crc kubenswrapper[4886]: I0129 16:28:34.911503 4886 generic.go:334] "Generic (PLEG): container finished" podID="43bcb21d-ccb0-474a-8a4b-20c4fd56904a" containerID="c2820ceebc223c1c65aa306b7a718275e55032ebaacfee4f9a51773b5b4cbd79" exitCode=0 Jan 29 16:28:34 crc kubenswrapper[4886]: I0129 16:28:34.911652 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43bcb21d-ccb0-474a-8a4b-20c4fd56904a","Type":"ContainerDied","Data":"c2820ceebc223c1c65aa306b7a718275e55032ebaacfee4f9a51773b5b4cbd79"} Jan 29 16:28:34 crc kubenswrapper[4886]: I0129 16:28:34.914913 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tsz6m" event={"ID":"86cef950-d7b4-468c-bb9f-e71a98ffe676","Type":"ContainerStarted","Data":"c32966a40b7f6cae2b96c8b36b42bffb0cd7e95653016c4bbfe44e4392146547"} Jan 29 16:28:34 crc kubenswrapper[4886]: I0129 16:28:34.915049 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tsz6m" event={"ID":"86cef950-d7b4-468c-bb9f-e71a98ffe676","Type":"ContainerStarted","Data":"20300c24bdc1beb8993563d477c2cef1160392a91272f3b2ac54e0c098dc63c3"} Jan 29 16:28:34 crc kubenswrapper[4886]: I0129 16:28:34.970357 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tsz6m" podStartSLOduration=3.005660485 podStartE2EDuration="4.970334067s" podCreationTimestamp="2026-01-29 16:28:30 +0000 UTC" firstStartedPulling="2026-01-29 16:28:30.963139889 +0000 UTC m=+393.871859161" lastFinishedPulling="2026-01-29 16:28:32.927813471 +0000 UTC m=+395.836532743" 
observedRunningTime="2026-01-29 16:28:34.964663809 +0000 UTC m=+397.873383091" watchObservedRunningTime="2026-01-29 16:28:34.970334067 +0000 UTC m=+397.879053349" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.394303 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-54754b854f-fgkbk"] Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.395009 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.415361 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54754b854f-fgkbk"] Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.573702 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-oauth-config\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.573821 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-serving-cert\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.573874 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-service-ca\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.574010 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-oauth-serving-cert\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.574063 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdgg\" (UniqueName: \"kubernetes.io/projected/56fe8de1-76b0-42ad-9f62-53ac51eac78d-kube-api-access-hqdgg\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.574095 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-config\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.574151 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-trusted-ca-bundle\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " 
pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.675807 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-trusted-ca-bundle\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.675895 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-oauth-config\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.675927 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-serving-cert\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.675946 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-service-ca\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.675968 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-oauth-serving-cert\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.675983 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdgg\" (UniqueName: \"kubernetes.io/projected/56fe8de1-76b0-42ad-9f62-53ac51eac78d-kube-api-access-hqdgg\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.676002 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-config\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.676943 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-trusted-ca-bundle\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.676965 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-config\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " 
pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.677572 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-oauth-serving-cert\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.679356 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-service-ca\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.681965 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-oauth-config\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.685647 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-serving-cert\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.692069 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdgg\" (UniqueName: \"kubernetes.io/projected/56fe8de1-76b0-42ad-9f62-53ac51eac78d-kube-api-access-hqdgg\") pod \"console-54754b854f-fgkbk\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.745561 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.885760 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-75f86dc845-cd7l9"] Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.887180 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.891509 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-w5r7w" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.891864 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.892127 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-8v90ublngch0f" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.892376 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.892580 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.892782 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.899930 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-75f86dc845-cd7l9"] Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.983009 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/768b9eb6-0280-46a3-a61a-295bd94524a5-metrics-server-audit-profiles\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.983062 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4xrw\" (UniqueName: \"kubernetes.io/projected/768b9eb6-0280-46a3-a61a-295bd94524a5-kube-api-access-p4xrw\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.983088 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/768b9eb6-0280-46a3-a61a-295bd94524a5-audit-log\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.983118 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/768b9eb6-0280-46a3-a61a-295bd94524a5-secret-metrics-client-certs\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.983464 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/768b9eb6-0280-46a3-a61a-295bd94524a5-secret-metrics-server-tls\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" 
Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.983573 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/768b9eb6-0280-46a3-a61a-295bd94524a5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:35 crc kubenswrapper[4886]: I0129 16:28:35.983715 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768b9eb6-0280-46a3-a61a-295bd94524a5-client-ca-bundle\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.078952 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2gkn5"] Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.079891 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.087050 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/768b9eb6-0280-46a3-a61a-295bd94524a5-metrics-server-audit-profiles\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.087167 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4xrw\" (UniqueName: \"kubernetes.io/projected/768b9eb6-0280-46a3-a61a-295bd94524a5-kube-api-access-p4xrw\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.087252 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/768b9eb6-0280-46a3-a61a-295bd94524a5-audit-log\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.087374 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/768b9eb6-0280-46a3-a61a-295bd94524a5-secret-metrics-client-certs\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.087486 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/768b9eb6-0280-46a3-a61a-295bd94524a5-secret-metrics-server-tls\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.087526 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/768b9eb6-0280-46a3-a61a-295bd94524a5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.087612 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768b9eb6-0280-46a3-a61a-295bd94524a5-client-ca-bundle\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.089661 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/768b9eb6-0280-46a3-a61a-295bd94524a5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.090286 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/768b9eb6-0280-46a3-a61a-295bd94524a5-audit-log\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.095180 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/768b9eb6-0280-46a3-a61a-295bd94524a5-secret-metrics-server-tls\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.095251 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/768b9eb6-0280-46a3-a61a-295bd94524a5-secret-metrics-client-certs\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.095997 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768b9eb6-0280-46a3-a61a-295bd94524a5-client-ca-bundle\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.099164 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2gkn5"] Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.100509 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/768b9eb6-0280-46a3-a61a-295bd94524a5-metrics-server-audit-profiles\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.125620 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4xrw\" 
(UniqueName: \"kubernetes.io/projected/768b9eb6-0280-46a3-a61a-295bd94524a5-kube-api-access-p4xrw\") pod \"metrics-server-75f86dc845-cd7l9\" (UID: \"768b9eb6-0280-46a3-a61a-295bd94524a5\") " pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.189358 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-registry-tls\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.189417 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-registry-certificates\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.189455 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-bound-sa-token\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.189497 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.189523 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-trusted-ca\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.189561 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.189597 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvh4t\" (UniqueName: \"kubernetes.io/projected/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-kube-api-access-bvh4t\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.189647 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.210261 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.216588 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.291433 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-registry-tls\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.291481 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-registry-certificates\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.291520 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-bound-sa-token\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.291573 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.291635 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-trusted-ca\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.292914 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvh4t\" (UniqueName: \"kubernetes.io/projected/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-kube-api-access-bvh4t\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.292984 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.293128 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-registry-certificates\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.293471 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.293739 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-trusted-ca\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.302577 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-registry-tls\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.313185 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-bound-sa-token\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.314150 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.317903 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvh4t\" (UniqueName: \"kubernetes.io/projected/80d1fdd6-c3ce-47c5-8a0f-4266880adb73-kube-api-access-bvh4t\") pod \"image-registry-66df7c8f76-2gkn5\" (UID: \"80d1fdd6-c3ce-47c5-8a0f-4266880adb73\") " pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.375030 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6466f85649-t8mxw"] Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.378442 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6466f85649-t8mxw" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.383398 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.384393 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.391597 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6466f85649-t8mxw"] Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.430101 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.499367 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9eaba21-71aa-42b0-a4dd-f46aeeb38d75-monitoring-plugin-cert\") pod \"monitoring-plugin-6466f85649-t8mxw\" (UID: \"b9eaba21-71aa-42b0-a4dd-f46aeeb38d75\") " pod="openshift-monitoring/monitoring-plugin-6466f85649-t8mxw" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.600893 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9eaba21-71aa-42b0-a4dd-f46aeeb38d75-monitoring-plugin-cert\") pod \"monitoring-plugin-6466f85649-t8mxw\" (UID: \"b9eaba21-71aa-42b0-a4dd-f46aeeb38d75\") " pod="openshift-monitoring/monitoring-plugin-6466f85649-t8mxw" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.606622 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9eaba21-71aa-42b0-a4dd-f46aeeb38d75-monitoring-plugin-cert\") pod \"monitoring-plugin-6466f85649-t8mxw\" (UID: \"b9eaba21-71aa-42b0-a4dd-f46aeeb38d75\") " pod="openshift-monitoring/monitoring-plugin-6466f85649-t8mxw" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.652477 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54754b854f-fgkbk"] Jan 29 16:28:36 crc kubenswrapper[4886]: W0129 16:28:36.659628 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fe8de1_76b0_42ad_9f62_53ac51eac78d.slice/crio-92457371ca67ffbaa6957a21cf77005c4601275089a8ad1b5d44bb6186c2a4ce WatchSource:0}: Error finding container 92457371ca67ffbaa6957a21cf77005c4601275089a8ad1b5d44bb6186c2a4ce: Status 404 returned error can't find the container with id 92457371ca67ffbaa6957a21cf77005c4601275089a8ad1b5d44bb6186c2a4ce Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.704823 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6466f85649-t8mxw" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.725373 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-75f86dc845-cd7l9"] Jan 29 16:28:36 crc kubenswrapper[4886]: W0129 16:28:36.738526 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod768b9eb6_0280_46a3_a61a_295bd94524a5.slice/crio-4eaa82eb79542e700a3dc1ebd54a2baf71ac12de9a804666daeb51f3971a6fbe WatchSource:0}: Error finding container 4eaa82eb79542e700a3dc1ebd54a2baf71ac12de9a804666daeb51f3971a6fbe: Status 404 returned error can't find the container with id 4eaa82eb79542e700a3dc1ebd54a2baf71ac12de9a804666daeb51f3971a6fbe Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.852694 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2gkn5"] Jan 29 16:28:36 crc kubenswrapper[4886]: W0129 16:28:36.855469 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80d1fdd6_c3ce_47c5_8a0f_4266880adb73.slice/crio-90e4fcdc257a1de4944f96f97343d73670d7eee11293b0d101d72b9536d01b39 WatchSource:0}: Error finding container 90e4fcdc257a1de4944f96f97343d73670d7eee11293b0d101d72b9536d01b39: Status 404 returned error can't find the container with id 90e4fcdc257a1de4944f96f97343d73670d7eee11293b0d101d72b9536d01b39 Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.941761 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54754b854f-fgkbk" event={"ID":"56fe8de1-76b0-42ad-9f62-53ac51eac78d","Type":"ContainerStarted","Data":"92457371ca67ffbaa6957a21cf77005c4601275089a8ad1b5d44bb6186c2a4ce"} Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.945161 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" event={"ID":"80d1fdd6-c3ce-47c5-8a0f-4266880adb73","Type":"ContainerStarted","Data":"90e4fcdc257a1de4944f96f97343d73670d7eee11293b0d101d72b9536d01b39"} Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.956009 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" event={"ID":"768b9eb6-0280-46a3-a61a-295bd94524a5","Type":"ContainerStarted","Data":"4eaa82eb79542e700a3dc1ebd54a2baf71ac12de9a804666daeb51f3971a6fbe"} Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.960013 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" event={"ID":"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe","Type":"ContainerStarted","Data":"5a79d99b5f579a92970a818ef0dffd3600c07cc8151dbc6cc7e2b9555f8bee95"} Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.978413 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.981476 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.986308 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.986687 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.993338 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.993416 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.993476 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.993618 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-p5vdv" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.993718 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.993882 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.994039 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.994165 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-4l5e9npcpeq8g" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.994381 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Jan 29 16:28:36 crc kubenswrapper[4886]: I0129 16:28:36.995015 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.000074 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.007710 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.095145 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6466f85649-t8mxw"] Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.112752 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.112816 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.112848 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.112869 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.112892 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/48feb470-6d6f-4fa2-a419-40698fb3a20a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.112915 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.112957 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-config\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.113004 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.113042 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.113073 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-web-config\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.113099 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48lp8\" (UniqueName: \"kubernetes.io/projected/48feb470-6d6f-4fa2-a419-40698fb3a20a-kube-api-access-48lp8\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.113128 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/48feb470-6d6f-4fa2-a419-40698fb3a20a-config-out\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.113148 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.113167 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.113189 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/48feb470-6d6f-4fa2-a419-40698fb3a20a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.113220 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.113244 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.113270 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.214965 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/48feb470-6d6f-4fa2-a419-40698fb3a20a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " 
pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215066 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215107 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215168 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215205 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215234 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215268 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215285 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215304 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/48feb470-6d6f-4fa2-a419-40698fb3a20a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215319 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215422 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-config\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215446 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215475 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215506 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-web-config\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215530 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48lp8\" (UniqueName: \"kubernetes.io/projected/48feb470-6d6f-4fa2-a419-40698fb3a20a-kube-api-access-48lp8\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215549 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/48feb470-6d6f-4fa2-a419-40698fb3a20a-config-out\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215570 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.215590 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.216490 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/48feb470-6d6f-4fa2-a419-40698fb3a20a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " 
pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.217306 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.217483 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.218066 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.222683 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.222701 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-config\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.222979 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.223030 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.223378 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/48feb470-6d6f-4fa2-a419-40698fb3a20a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.223874 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " 
pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.224950 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.225945 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.226153 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.226443 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48feb470-6d6f-4fa2-a419-40698fb3a20a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.227221 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/48feb470-6d6f-4fa2-a419-40698fb3a20a-config-out\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.231088 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.232156 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/48feb470-6d6f-4fa2-a419-40698fb3a20a-web-config\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.234065 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48lp8\" (UniqueName: \"kubernetes.io/projected/48feb470-6d6f-4fa2-a419-40698fb3a20a-kube-api-access-48lp8\") pod \"prometheus-k8s-0\" (UID: \"48feb470-6d6f-4fa2-a419-40698fb3a20a\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.312794 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.757651 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.967673 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54754b854f-fgkbk" event={"ID":"56fe8de1-76b0-42ad-9f62-53ac51eac78d","Type":"ContainerStarted","Data":"912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790"} Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.969019 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" event={"ID":"80d1fdd6-c3ce-47c5-8a0f-4266880adb73","Type":"ContainerStarted","Data":"f7f21339e8b5d9e979f032ac68d1f691b895f9169eb17316a17d9e74f3a087d8"} Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.969091 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.970216 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6466f85649-t8mxw" event={"ID":"b9eaba21-71aa-42b0-a4dd-f46aeeb38d75","Type":"ContainerStarted","Data":"474fbaf9ec6360367c8c7de16802779bff73d18a04ea0a7363497a40275621fd"} Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.972551 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" event={"ID":"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe","Type":"ContainerStarted","Data":"c8261013a52374865a926bb30817ba8c7e1d18820a123b54291a565eaf202a50"} Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.972589 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" event={"ID":"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe","Type":"ContainerStarted","Data":"48fdba91188bffbdcc4503011cb66c2b6eb969cb39aa5279f885e11ef60b2240"} Jan 29 16:28:37 crc kubenswrapper[4886]: I0129 16:28:37.989013 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54754b854f-fgkbk" podStartSLOduration=2.988994392 podStartE2EDuration="2.988994392s" podCreationTimestamp="2026-01-29 16:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:28:37.983711065 +0000 UTC m=+400.892430347" watchObservedRunningTime="2026-01-29 16:28:37.988994392 +0000 UTC m=+400.897713664" Jan 29 16:28:38 crc kubenswrapper[4886]: I0129 16:28:38.006164 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" podStartSLOduration=2.006141339 podStartE2EDuration="2.006141339s" podCreationTimestamp="2026-01-29 16:28:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:28:38.002526779 +0000 UTC m=+400.911246071" watchObservedRunningTime="2026-01-29 16:28:38.006141339 +0000 UTC m=+400.914860621" Jan 29 16:28:38 crc kubenswrapper[4886]: W0129 16:28:38.013603 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48feb470_6d6f_4fa2_a419_40698fb3a20a.slice/crio-98d7b8f912eb792e2359b20d73cde0c761ce41249ed31c81319b81160a79c2be WatchSource:0}: 
Error finding container 98d7b8f912eb792e2359b20d73cde0c761ce41249ed31c81319b81160a79c2be: Status 404 returned error can't find the container with id 98d7b8f912eb792e2359b20d73cde0c761ce41249ed31c81319b81160a79c2be Jan 29 16:28:38 crc kubenswrapper[4886]: I0129 16:28:38.979502 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48feb470-6d6f-4fa2-a419-40698fb3a20a","Type":"ContainerStarted","Data":"98d7b8f912eb792e2359b20d73cde0c761ce41249ed31c81319b81160a79c2be"} Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.985908 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" event={"ID":"768b9eb6-0280-46a3-a61a-295bd94524a5","Type":"ContainerStarted","Data":"aa327d0ed65b7a7a6d9d1efaed64b62b4582738e8bb6568c032b576d5049498c"} Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.988067 4886 generic.go:334] "Generic (PLEG): container finished" podID="48feb470-6d6f-4fa2-a419-40698fb3a20a" containerID="864465bfce45690db67ec09c51f7784b7586fd57663ec14d23d55a7249531051" exitCode=0 Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.988103 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48feb470-6d6f-4fa2-a419-40698fb3a20a","Type":"ContainerDied","Data":"864465bfce45690db67ec09c51f7784b7586fd57663ec14d23d55a7249531051"} Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.991759 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" event={"ID":"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe","Type":"ContainerStarted","Data":"9dfa9660a643ebfef6fb6beb437138e9d0be1d72066ba292990b917c1f8251b9"} Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.991779 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" event={"ID":"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe","Type":"ContainerStarted","Data":"e73536c76aabc41333f27b78607e5061d2acbf3535d03e3dfc24c84ead509204"} Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.991788 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" event={"ID":"cf13b56e-deb1-4a2d-8d41-139db9eb5dbe","Type":"ContainerStarted","Data":"732aa0bd94ae9bf234c8fa99419b6a18949c973b4a3390e3f424b4e36d62abc0"} Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.991893 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.994426 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43bcb21d-ccb0-474a-8a4b-20c4fd56904a","Type":"ContainerStarted","Data":"b2929ed4083976446ae3936ef7983e104428f4ba48e0d75e9ebd94b28f258fa0"} Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.994444 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43bcb21d-ccb0-474a-8a4b-20c4fd56904a","Type":"ContainerStarted","Data":"71d73030becd37dcc45fe94e957c2d80ece7a3e663ac5116bdb57b61dbf19409"} Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.994452 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43bcb21d-ccb0-474a-8a4b-20c4fd56904a","Type":"ContainerStarted","Data":"cd0fc80222db14407c42a547cf952b46e9ea7abe07369fd9d2acdac3ecdb7eb1"} Jan 
29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.994461 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43bcb21d-ccb0-474a-8a4b-20c4fd56904a","Type":"ContainerStarted","Data":"845da9b1e1e98cb54466ef3261f8816bac1e9ca66a544ea922fb46b70c038ae7"} Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.998395 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6466f85649-t8mxw" event={"ID":"b9eaba21-71aa-42b0-a4dd-f46aeeb38d75","Type":"ContainerStarted","Data":"bcd724ab6af38175b3778fb8df83bd9edb06dcb003618a18fd21213fb0ce461b"} Jan 29 16:28:39 crc kubenswrapper[4886]: I0129 16:28:39.998645 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-6466f85649-t8mxw" Jan 29 16:28:40 crc kubenswrapper[4886]: I0129 16:28:40.001315 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" podStartSLOduration=2.506843023 podStartE2EDuration="5.001302939s" podCreationTimestamp="2026-01-29 16:28:35 +0000 UTC" firstStartedPulling="2026-01-29 16:28:36.746654761 +0000 UTC m=+399.655374033" lastFinishedPulling="2026-01-29 16:28:39.241114677 +0000 UTC m=+402.149833949" observedRunningTime="2026-01-29 16:28:40.000837276 +0000 UTC m=+402.909556558" watchObservedRunningTime="2026-01-29 16:28:40.001302939 +0000 UTC m=+402.910022211" Jan 29 16:28:40 crc kubenswrapper[4886]: I0129 16:28:40.026370 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6466f85649-t8mxw" podStartSLOduration=1.881509554 podStartE2EDuration="4.026356296s" podCreationTimestamp="2026-01-29 16:28:36 +0000 UTC" firstStartedPulling="2026-01-29 16:28:37.104206475 +0000 UTC m=+400.012925747" lastFinishedPulling="2026-01-29 16:28:39.249053217 +0000 UTC m=+402.157772489" observedRunningTime="2026-01-29 16:28:40.024486804 +0000 UTC m=+402.933206076" watchObservedRunningTime="2026-01-29 16:28:40.026356296 +0000 UTC m=+402.935075568" Jan 29 16:28:40 crc kubenswrapper[4886]: I0129 16:28:40.035781 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6466f85649-t8mxw" Jan 29 16:28:40 crc kubenswrapper[4886]: I0129 16:28:40.094717 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" podStartSLOduration=2.604295997 podStartE2EDuration="8.094698197s" podCreationTimestamp="2026-01-29 16:28:32 +0000 UTC" firstStartedPulling="2026-01-29 16:28:33.767759521 +0000 UTC m=+396.676478793" lastFinishedPulling="2026-01-29 16:28:39.258161691 +0000 UTC m=+402.166880993" observedRunningTime="2026-01-29 16:28:40.092101175 +0000 UTC m=+403.000820447" watchObservedRunningTime="2026-01-29 16:28:40.094698197 +0000 UTC m=+403.003417459" Jan 29 16:28:41 crc kubenswrapper[4886]: I0129 16:28:41.010369 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43bcb21d-ccb0-474a-8a4b-20c4fd56904a","Type":"ContainerStarted","Data":"a6a5f55de3dd19fea82b6bfe910ade259dfcbbfbf8e6492e7415f723d5cb1b9d"} Jan 29 16:28:41 crc kubenswrapper[4886]: I0129 16:28:41.010431 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"43bcb21d-ccb0-474a-8a4b-20c4fd56904a","Type":"ContainerStarted","Data":"e436e36913e98932295a1cded5e80be6f060628efe1ccaff4e27d5666b162782"} Jan 29 16:28:41 crc kubenswrapper[4886]: I0129 16:28:41.020425 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-645496d5c-x86sq" Jan 29 16:28:41 crc kubenswrapper[4886]: I0129 16:28:41.044397 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.7912786670000003 podStartE2EDuration="10.044369038s" podCreationTimestamp="2026-01-29 16:28:31 +0000 UTC" firstStartedPulling="2026-01-29 16:28:32.921014372 +0000 UTC m=+395.829733644" lastFinishedPulling="2026-01-29 16:28:39.174104743 +0000 UTC m=+402.082824015" observedRunningTime="2026-01-29 16:28:41.036010686 +0000 UTC m=+403.944729978" watchObservedRunningTime="2026-01-29 16:28:41.044369038 +0000 UTC m=+403.953088330" Jan 29 16:28:43 crc kubenswrapper[4886]: I0129 16:28:43.542468 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" podUID="92af746d-c60d-46a4-9be0-0ad28882ac0e" containerName="oauth-openshift" containerID="cri-o://47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e" gracePeriod=15 Jan 29 16:28:43 crc kubenswrapper[4886]: E0129 16:28:43.766024 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 16:28:43 crc kubenswrapper[4886]: E0129 16:28:43.766518 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8jsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-q5hs7_openshift-marketplace(a7325ad0-28bf-45e0-bbd5-160f441de091): ErrImagePull: initializing source 
docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:43 crc kubenswrapper[4886]: E0129 16:28:43.767692 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:28:43 crc kubenswrapper[4886]: I0129 16:28:43.983887 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.011510 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-c659c4677-kmlgq"] Jan 29 16:28:44 crc kubenswrapper[4886]: E0129 16:28:44.011792 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92af746d-c60d-46a4-9be0-0ad28882ac0e" containerName="oauth-openshift" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.011805 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="92af746d-c60d-46a4-9be0-0ad28882ac0e" containerName="oauth-openshift" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.011952 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="92af746d-c60d-46a4-9be0-0ad28882ac0e" containerName="oauth-openshift" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.015668 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.031194 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-c659c4677-kmlgq"] Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.040766 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48feb470-6d6f-4fa2-a419-40698fb3a20a","Type":"ContainerStarted","Data":"f45282d692ecff56ac6b45257f4526bfbc95301c27c148001ac177998831b5c8"} Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.041058 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48feb470-6d6f-4fa2-a419-40698fb3a20a","Type":"ContainerStarted","Data":"c0739d6b176c7f54f83d7375bf40b37046368ae1be9d8b805d819e6fdd043b90"} Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.041193 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48feb470-6d6f-4fa2-a419-40698fb3a20a","Type":"ContainerStarted","Data":"23d39dacd0d34217d4ec721d08220e5c4de0e967c2110943736e189bf8a9483a"} Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.041390 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48feb470-6d6f-4fa2-a419-40698fb3a20a","Type":"ContainerStarted","Data":"1aa73a87069bdb664b356c00ae9795fc1f033de4a55964e20bebd5ab17ebf38d"} Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.041522 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"48feb470-6d6f-4fa2-a419-40698fb3a20a","Type":"ContainerStarted","Data":"9f001fc76e705569bff9b276d6f937c9a6baae37918b2c313582cc886869c062"} Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.041653 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48feb470-6d6f-4fa2-a419-40698fb3a20a","Type":"ContainerStarted","Data":"5f71b5a438496a8654fa0cf90e3ff3023f14b629a79d9d8b70ad346dfcc5ec21"} Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.045688 4886 generic.go:334] "Generic (PLEG): container finished" podID="92af746d-c60d-46a4-9be0-0ad28882ac0e" containerID="47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e" exitCode=0 Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.045715 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.045737 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" event={"ID":"92af746d-c60d-46a4-9be0-0ad28882ac0e","Type":"ContainerDied","Data":"47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e"} Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.046181 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg" event={"ID":"92af746d-c60d-46a4-9be0-0ad28882ac0e","Type":"ContainerDied","Data":"14141aff9fbd287a70454765b395ba76ef2991c8de80ea1c92111cb0e0c784c3"} Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.046209 4886 scope.go:117] "RemoveContainer" containerID="47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.067864 4886 scope.go:117] "RemoveContainer" containerID="47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e" Jan 29 16:28:44 crc kubenswrapper[4886]: E0129 16:28:44.068877 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e\": container with ID starting with 47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e not found: ID does not exist" containerID="47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.069124 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e"} err="failed to get container status \"47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e\": rpc error: code = NotFound desc = could not find container \"47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e\": container with ID starting with 47b4200b809c1086f4ae9fa69412cd5a201589369e8ff103458bcc2e4a47f38e not found: ID does not exist" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.086553 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.957244436 podStartE2EDuration="8.086536638s" podCreationTimestamp="2026-01-29 16:28:36 +0000 UTC" firstStartedPulling="2026-01-29 16:28:39.989445009 +0000 UTC m=+402.898164281" lastFinishedPulling="2026-01-29 16:28:43.118737211 +0000 UTC m=+406.027456483" observedRunningTime="2026-01-29 16:28:44.084405609 +0000 UTC m=+406.993124901" 
watchObservedRunningTime="2026-01-29 16:28:44.086536638 +0000 UTC m=+406.995255910" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.145991 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-dir\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.146076 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-serving-cert\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.146117 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-idp-0-file-data\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.146143 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-login\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.146399 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.147258 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-cliconfig\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.147314 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-policies\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.147853 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-869nb\" (UniqueName: \"kubernetes.io/projected/92af746d-c60d-46a4-9be0-0ad28882ac0e-kube-api-access-869nb\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.148227 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-trusted-ca-bundle\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.148272 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-provider-selection\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.148398 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-error\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.148477 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-router-certs\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.148502 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-session\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.148539 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-ocp-branding-template\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc 
kubenswrapper[4886]: I0129 16:28:44.148658 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-service-ca\") pod \"92af746d-c60d-46a4-9be0-0ad28882ac0e\" (UID: \"92af746d-c60d-46a4-9be0-0ad28882ac0e\") " Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.148941 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-service-ca\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.148985 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-template-login\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.149032 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-router-certs\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.149096 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.149196 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9jk9\" (UniqueName: \"kubernetes.io/projected/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-kube-api-access-b9jk9\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.149549 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.149598 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-template-error\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " 
pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.149644 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.149728 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.149847 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.149905 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-audit-policies\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.149963 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-audit-dir\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.150004 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.150113 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-session\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.150284 4886 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.147394 
4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.147780 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.150153 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.150679 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.151347 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.151729 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92af746d-c60d-46a4-9be0-0ad28882ac0e-kube-api-access-869nb" (OuterVolumeSpecName: "kube-api-access-869nb") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "kube-api-access-869nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.152556 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.153058 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.154227 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.154718 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.155381 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.156244 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.157592 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "92af746d-c60d-46a4-9be0-0ad28882ac0e" (UID: "92af746d-c60d-46a4-9be0-0ad28882ac0e"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252000 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252096 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9jk9\" (UniqueName: \"kubernetes.io/projected/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-kube-api-access-b9jk9\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252163 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252202 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-template-error\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252243 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252292 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252377 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252420 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-audit-policies\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " 
pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252463 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-audit-dir\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252504 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252540 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-session\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252608 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-service-ca\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252641 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-template-login\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252685 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-router-certs\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252765 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252791 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252812 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-serving-cert\") on 
node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252833 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252852 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252871 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252889 4886 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252908 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-869nb\" (UniqueName: \"kubernetes.io/projected/92af746d-c60d-46a4-9be0-0ad28882ac0e-kube-api-access-869nb\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252927 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.252946 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.254936 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-audit-dir\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.255007 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.255068 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.255579 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-audit-policies\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.256249 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-service-ca\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.256300 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.256344 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.256359 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92af746d-c60d-46a4-9be0-0ad28882ac0e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.256724 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.261604 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-template-error\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.261744 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-session\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.270752 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.270814 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.271879 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-user-template-login\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.273062 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.277196 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9jk9\" (UniqueName: \"kubernetes.io/projected/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-kube-api-access-b9jk9\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.281937 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fdc5748-bb0c-435f-9cd3-9c093d647bf1-v4-0-config-system-router-certs\") pod \"oauth-openshift-c659c4677-kmlgq\" (UID: \"8fdc5748-bb0c-435f-9cd3-9c093d647bf1\") " pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.331973 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.391347 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg"] Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.395691 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-9fbfc7dc4-r9gqg"] Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.622879 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92af746d-c60d-46a4-9be0-0ad28882ac0e" path="/var/lib/kubelet/pods/92af746d-c60d-46a4-9be0-0ad28882ac0e/volumes" Jan 29 16:28:44 crc kubenswrapper[4886]: I0129 16:28:44.761091 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-c659c4677-kmlgq"] Jan 29 16:28:44 crc kubenswrapper[4886]: W0129 16:28:44.764939 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fdc5748_bb0c_435f_9cd3_9c093d647bf1.slice/crio-ccf14c137f3b2319917ab4d4372a94bb3040e4782239b64f2619fbff882e721f WatchSource:0}: Error finding container ccf14c137f3b2319917ab4d4372a94bb3040e4782239b64f2619fbff882e721f: Status 404 returned error can't find the container with id ccf14c137f3b2319917ab4d4372a94bb3040e4782239b64f2619fbff882e721f Jan 29 16:28:45 crc kubenswrapper[4886]: I0129 16:28:45.058244 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" event={"ID":"8fdc5748-bb0c-435f-9cd3-9c093d647bf1","Type":"ContainerStarted","Data":"29e1aa2a3cc88075c47eedab0663e96cb963c626f255371a2afe9139afeb422e"} Jan 29 16:28:45 crc kubenswrapper[4886]: I0129 16:28:45.058769 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" event={"ID":"8fdc5748-bb0c-435f-9cd3-9c093d647bf1","Type":"ContainerStarted","Data":"ccf14c137f3b2319917ab4d4372a94bb3040e4782239b64f2619fbff882e721f"} Jan 29 16:28:45 crc kubenswrapper[4886]: I0129 16:28:45.080829 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" podStartSLOduration=27.08081433 podStartE2EDuration="27.08081433s" podCreationTimestamp="2026-01-29 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:28:45.077073546 +0000 UTC m=+407.985792818" watchObservedRunningTime="2026-01-29 16:28:45.08081433 +0000 UTC m=+407.989533602" Jan 29 16:28:45 crc kubenswrapper[4886]: I0129 16:28:45.746603 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:45 crc kubenswrapper[4886]: I0129 16:28:45.746665 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:45 crc kubenswrapper[4886]: I0129 16:28:45.748754 4886 patch_prober.go:28] interesting pod/console-54754b854f-fgkbk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.73:8443/health\": dial tcp 10.217.0.73:8443: connect: connection refused" start-of-body= Jan 29 16:28:45 crc kubenswrapper[4886]: I0129 16:28:45.748820 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-54754b854f-fgkbk" 
podUID="56fe8de1-76b0-42ad-9f62-53ac51eac78d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.73:8443/health\": dial tcp 10.217.0.73:8443: connect: connection refused" Jan 29 16:28:45 crc kubenswrapper[4886]: E0129 16:28:45.749364 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 16:28:45 crc kubenswrapper[4886]: E0129 16:28:45.749464 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mlnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jfv6k_openshift-marketplace(69003a39-1c09-4087-a494-ebfd69e973cf): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:45 crc kubenswrapper[4886]: E0129 16:28:45.751306 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:28:46 crc kubenswrapper[4886]: I0129 16:28:46.065342 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:46 crc kubenswrapper[4886]: I0129 16:28:46.072218 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-c659c4677-kmlgq" Jan 29 16:28:46 crc kubenswrapper[4886]: E0129 16:28:46.746362 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 16:28:46 crc kubenswrapper[4886]: E0129 16:28:46.746720 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn92n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zkk68_openshift-marketplace(d84ce3e9-c41a-4a08-8d86-2a918d5e9450): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:46 crc kubenswrapper[4886]: E0129 16:28:46.748185 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:28:47 crc kubenswrapper[4886]: I0129 16:28:47.313191 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:28:47 crc kubenswrapper[4886]: E0129 16:28:47.743157 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 16:28:47 crc kubenswrapper[4886]: E0129 16:28:47.743402 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vf7sq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4qbl4_openshift-marketplace(57aa9115-b2d5-45aa-8ac3-e251c0907e45): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:28:47 crc kubenswrapper[4886]: E0129 16:28:47.744626 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:28:55 crc kubenswrapper[4886]: I0129 16:28:55.753550 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:55 crc kubenswrapper[4886]: I0129 16:28:55.766083 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:28:55 crc kubenswrapper[4886]: I0129 16:28:55.874508 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-frztl"] Jan 29 16:28:56 crc kubenswrapper[4886]: I0129 16:28:56.217368 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:56 crc kubenswrapper[4886]: I0129 16:28:56.217443 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:28:56 crc kubenswrapper[4886]: I0129 16:28:56.437619 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-2gkn5" Jan 29 16:28:56 crc kubenswrapper[4886]: I0129 16:28:56.502803 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-44l86"] Jan 29 16:28:57 crc kubenswrapper[4886]: E0129 16:28:57.616825 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:28:57 crc kubenswrapper[4886]: E0129 16:28:57.617123 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:28:58 crc kubenswrapper[4886]: E0129 16:28:58.628572 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:28:59 crc kubenswrapper[4886]: I0129 16:28:59.661043 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:28:59 crc kubenswrapper[4886]: I0129 16:28:59.661106 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:29:00 crc kubenswrapper[4886]: E0129 16:29:00.616382 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:29:08 crc kubenswrapper[4886]: E0129 16:29:08.623938 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:29:11 crc kubenswrapper[4886]: E0129 16:29:11.618365 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:29:12 crc kubenswrapper[4886]: E0129 16:29:12.617016 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:29:14 crc kubenswrapper[4886]: E0129 16:29:14.619923 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:29:16 crc kubenswrapper[4886]: I0129 16:29:16.222480 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:29:16 crc kubenswrapper[4886]: I0129 16:29:16.236881 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-75f86dc845-cd7l9" Jan 29 16:29:20 crc kubenswrapper[4886]: I0129 16:29:20.919526 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-frztl" podUID="ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" containerName="console" containerID="cri-o://1b0d59f7a0b0f2503aadbe69a4ed4abbcb0da9a1640279030e487d1ecaa3fce8" gracePeriod=15 Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.293279 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-frztl_ffb1a6d7-9220-473e-9fcd-8d91d590f3a5/console/0.log" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.293715 4886 generic.go:334] "Generic (PLEG): container finished" podID="ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" containerID="1b0d59f7a0b0f2503aadbe69a4ed4abbcb0da9a1640279030e487d1ecaa3fce8" exitCode=2 Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.293754 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-frztl" event={"ID":"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5","Type":"ContainerDied","Data":"1b0d59f7a0b0f2503aadbe69a4ed4abbcb0da9a1640279030e487d1ecaa3fce8"} Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.420269 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-frztl_ffb1a6d7-9220-473e-9fcd-8d91d590f3a5/console/0.log" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.420391 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-frztl" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.551614 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-service-ca\") pod \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\" (UID: \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\") " Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.551684 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-trusted-ca-bundle\") pod \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\" (UID: \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\") " Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.551745 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-oauth-serving-cert\") pod \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\" (UID: \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\") " Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.551782 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-console-oauth-config\") pod \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\" (UID: \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\") " Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.551802 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-console-serving-cert\") pod \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\" (UID: \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\") " Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.551824 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-console-config\") pod \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\" (UID: \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\") " Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.551860 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhgl2\" (UniqueName: \"kubernetes.io/projected/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-kube-api-access-zhgl2\") pod \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\" (UID: \"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5\") " Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.553080 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-service-ca" (OuterVolumeSpecName: "service-ca") pod "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" (UID: "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.553205 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" (UID: "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.553312 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-console-config" (OuterVolumeSpecName: "console-config") pod "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" (UID: "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.553306 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" (UID: "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.554065 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-44l86" podUID="b1d6caa5-f77a-4acf-a631-0c3abb84959c" containerName="registry" containerID="cri-o://deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f" gracePeriod=30 Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.558164 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" (UID: "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.558220 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-kube-api-access-zhgl2" (OuterVolumeSpecName: "kube-api-access-zhgl2") pod "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" (UID: "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5"). InnerVolumeSpecName "kube-api-access-zhgl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.559046 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" (UID: "ffb1a6d7-9220-473e-9fcd-8d91d590f3a5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.653235 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhgl2\" (UniqueName: \"kubernetes.io/projected/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-kube-api-access-zhgl2\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.653274 4886 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.653284 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.653292 4886 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.653301 4886 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.653309 4886 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:21 crc kubenswrapper[4886]: I0129 16:29:21.653319 4886 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5-console-config\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:22 crc kubenswrapper[4886]: I0129 16:29:22.303415 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-frztl_ffb1a6d7-9220-473e-9fcd-8d91d590f3a5/console/0.log" Jan 29 16:29:22 crc kubenswrapper[4886]: I0129 16:29:22.303700 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-frztl" event={"ID":"ffb1a6d7-9220-473e-9fcd-8d91d590f3a5","Type":"ContainerDied","Data":"f5f1eb8dc3efdd72b68491a7af9fe6df247f17abe7404590089aab88c87a64e1"} Jan 29 16:29:22 crc kubenswrapper[4886]: I0129 16:29:22.303753 4886 scope.go:117] "RemoveContainer" containerID="1b0d59f7a0b0f2503aadbe69a4ed4abbcb0da9a1640279030e487d1ecaa3fce8" Jan 29 16:29:22 crc kubenswrapper[4886]: I0129 16:29:22.303791 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-frztl" Jan 29 16:29:22 crc kubenswrapper[4886]: I0129 16:29:22.338089 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-frztl"] Jan 29 16:29:22 crc kubenswrapper[4886]: I0129 16:29:22.343283 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-frztl"] Jan 29 16:29:22 crc kubenswrapper[4886]: E0129 16:29:22.617907 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:29:22 crc kubenswrapper[4886]: I0129 16:29:22.624860 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" path="/var/lib/kubelet/pods/ffb1a6d7-9220-473e-9fcd-8d91d590f3a5/volumes" Jan 29 16:29:22 crc kubenswrapper[4886]: I0129 16:29:22.961088 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-44l86" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.076034 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1d6caa5-f77a-4acf-a631-0c3abb84959c-registry-certificates\") pod \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\" (UID: \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\") " Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.076395 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1d6caa5-f77a-4acf-a631-0c3abb84959c-bound-sa-token\") pod \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\" (UID: \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\") " Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.076504 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\" (UID: \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\") " Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.076543 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1d6caa5-f77a-4acf-a631-0c3abb84959c-trusted-ca\") pod \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\" (UID: \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\") " Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.076583 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1d6caa5-f77a-4acf-a631-0c3abb84959c-installation-pull-secrets\") pod \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\" (UID: \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\") " Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.076658 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1d6caa5-f77a-4acf-a631-0c3abb84959c-registry-tls\") pod \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\" (UID: \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\") " Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.076696 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1d6caa5-f77a-4acf-a631-0c3abb84959c-ca-trust-extracted\") pod \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\" (UID: \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\") " Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.076736 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vgbh\" (UniqueName: \"kubernetes.io/projected/b1d6caa5-f77a-4acf-a631-0c3abb84959c-kube-api-access-8vgbh\") pod \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\" (UID: \"b1d6caa5-f77a-4acf-a631-0c3abb84959c\") " Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.077190 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d6caa5-f77a-4acf-a631-0c3abb84959c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b1d6caa5-f77a-4acf-a631-0c3abb84959c" (UID: "b1d6caa5-f77a-4acf-a631-0c3abb84959c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.077383 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d6caa5-f77a-4acf-a631-0c3abb84959c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b1d6caa5-f77a-4acf-a631-0c3abb84959c" (UID: "b1d6caa5-f77a-4acf-a631-0c3abb84959c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.082898 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1d6caa5-f77a-4acf-a631-0c3abb84959c-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.082919 4886 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1d6caa5-f77a-4acf-a631-0c3abb84959c-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.083994 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d6caa5-f77a-4acf-a631-0c3abb84959c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b1d6caa5-f77a-4acf-a631-0c3abb84959c" (UID: "b1d6caa5-f77a-4acf-a631-0c3abb84959c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.084215 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d6caa5-f77a-4acf-a631-0c3abb84959c-kube-api-access-8vgbh" (OuterVolumeSpecName: "kube-api-access-8vgbh") pod "b1d6caa5-f77a-4acf-a631-0c3abb84959c" (UID: "b1d6caa5-f77a-4acf-a631-0c3abb84959c"). InnerVolumeSpecName "kube-api-access-8vgbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.084549 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d6caa5-f77a-4acf-a631-0c3abb84959c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b1d6caa5-f77a-4acf-a631-0c3abb84959c" (UID: "b1d6caa5-f77a-4acf-a631-0c3abb84959c"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.085944 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d6caa5-f77a-4acf-a631-0c3abb84959c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b1d6caa5-f77a-4acf-a631-0c3abb84959c" (UID: "b1d6caa5-f77a-4acf-a631-0c3abb84959c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.090105 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b1d6caa5-f77a-4acf-a631-0c3abb84959c" (UID: "b1d6caa5-f77a-4acf-a631-0c3abb84959c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.101539 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d6caa5-f77a-4acf-a631-0c3abb84959c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b1d6caa5-f77a-4acf-a631-0c3abb84959c" (UID: "b1d6caa5-f77a-4acf-a631-0c3abb84959c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.183874 4886 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1d6caa5-f77a-4acf-a631-0c3abb84959c-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.183915 4886 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1d6caa5-f77a-4acf-a631-0c3abb84959c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.183964 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vgbh\" (UniqueName: \"kubernetes.io/projected/b1d6caa5-f77a-4acf-a631-0c3abb84959c-kube-api-access-8vgbh\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.183978 4886 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1d6caa5-f77a-4acf-a631-0c3abb84959c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.183989 4886 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1d6caa5-f77a-4acf-a631-0c3abb84959c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.318569 4886 generic.go:334] "Generic (PLEG): container finished" podID="b1d6caa5-f77a-4acf-a631-0c3abb84959c" containerID="deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f" exitCode=0 Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.318609 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-44l86" event={"ID":"b1d6caa5-f77a-4acf-a631-0c3abb84959c","Type":"ContainerDied","Data":"deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f"} Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.318594 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-44l86" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.318644 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-44l86" event={"ID":"b1d6caa5-f77a-4acf-a631-0c3abb84959c","Type":"ContainerDied","Data":"a00a9bdfeb0d8ca50bb13348e56690ba099ee336a61298251b903a6dea3d27eb"} Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.318663 4886 scope.go:117] "RemoveContainer" containerID="deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.353050 4886 scope.go:117] "RemoveContainer" containerID="deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f" Jan 29 16:29:23 crc kubenswrapper[4886]: E0129 16:29:23.353789 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f\": container with ID starting with deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f not found: ID does not exist" containerID="deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.353841 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f"} err="failed to get container status \"deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f\": rpc error: code = NotFound desc = could not find container \"deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f\": container with ID starting with deed27046f024e80d24dc9a6d74e2361911272418a25dac03f3d34ed2d07513f not found: ID does not exist" Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.354378 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-44l86"] Jan 29 16:29:23 crc kubenswrapper[4886]: I0129 16:29:23.359146 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-44l86"] Jan 29 16:29:24 crc kubenswrapper[4886]: I0129 16:29:24.629663 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d6caa5-f77a-4acf-a631-0c3abb84959c" path="/var/lib/kubelet/pods/b1d6caa5-f77a-4acf-a631-0c3abb84959c/volumes" Jan 29 16:29:25 crc kubenswrapper[4886]: E0129 16:29:25.618197 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:29:26 crc kubenswrapper[4886]: E0129 16:29:26.618661 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:29:26 crc kubenswrapper[4886]: E0129 16:29:26.749606 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 16:29:26 crc kubenswrapper[4886]: E0129 16:29:26.750104 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mlnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jfv6k_openshift-marketplace(69003a39-1c09-4087-a494-ebfd69e973cf): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:29:26 crc kubenswrapper[4886]: E0129 16:29:26.751967 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:29:29 crc kubenswrapper[4886]: I0129 16:29:29.661231 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:29:29 crc kubenswrapper[4886]: I0129 16:29:29.661357 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:29:29 crc kubenswrapper[4886]: I0129 16:29:29.661427 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 16:29:29 crc kubenswrapper[4886]: I0129 16:29:29.662369 4886 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96fb4b3b0684eec0f8e815c984345d77640459634c9d28cbf8434505ebf34891"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 16:29:29 crc kubenswrapper[4886]: I0129 16:29:29.662478 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://96fb4b3b0684eec0f8e815c984345d77640459634c9d28cbf8434505ebf34891" gracePeriod=600 Jan 29 16:29:30 crc kubenswrapper[4886]: I0129 16:29:30.384008 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="96fb4b3b0684eec0f8e815c984345d77640459634c9d28cbf8434505ebf34891" exitCode=0 Jan 29 16:29:30 crc kubenswrapper[4886]: I0129 16:29:30.384105 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"96fb4b3b0684eec0f8e815c984345d77640459634c9d28cbf8434505ebf34891"} Jan 29 16:29:30 crc kubenswrapper[4886]: I0129 16:29:30.384997 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"ae7876e7e5e026deccf52515d738eb4b775938bb13eef71ab45573508b57aaa0"} Jan 29 16:29:30 crc kubenswrapper[4886]: I0129 16:29:30.385035 4886 scope.go:117] "RemoveContainer" containerID="8055fe73a1cd8fb346a9937fb9960eb4b8cf16950f5ed88b206f4a30871b1028" Jan 29 16:29:35 crc kubenswrapper[4886]: E0129 16:29:35.739670 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 16:29:35 crc kubenswrapper[4886]: E0129 16:29:35.740296 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8jsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-q5hs7_openshift-marketplace(a7325ad0-28bf-45e0-bbd5-160f441de091): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:29:35 crc kubenswrapper[4886]: E0129 16:29:35.741745 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:29:37 crc kubenswrapper[4886]: I0129 16:29:37.313107 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:29:37 crc kubenswrapper[4886]: I0129 16:29:37.345577 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:29:37 crc kubenswrapper[4886]: I0129 16:29:37.452106 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Jan 29 16:29:37 crc kubenswrapper[4886]: E0129 16:29:37.735977 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 16:29:37 crc kubenswrapper[4886]: E0129 16:29:37.736340 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn92n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zkk68_openshift-marketplace(d84ce3e9-c41a-4a08-8d86-2a918d5e9450): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:29:37 crc kubenswrapper[4886]: E0129 16:29:37.737549 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:29:38 crc kubenswrapper[4886]: E0129 16:29:38.749531 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 16:29:38 crc kubenswrapper[4886]: E0129 16:29:38.749741 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vf7sq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4qbl4_openshift-marketplace(57aa9115-b2d5-45aa-8ac3-e251c0907e45): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:29:38 crc kubenswrapper[4886]: E0129 16:29:38.751006 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:29:40 crc kubenswrapper[4886]: E0129 16:29:40.618513 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:29:46 crc kubenswrapper[4886]: E0129 16:29:46.620755 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.051011 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-664586d6fb-g55cf"] Jan 29 16:29:50 crc kubenswrapper[4886]: E0129 16:29:50.051922 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d6caa5-f77a-4acf-a631-0c3abb84959c" containerName="registry" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.051943 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d6caa5-f77a-4acf-a631-0c3abb84959c" containerName="registry" Jan 29 16:29:50 crc kubenswrapper[4886]: E0129 16:29:50.051969 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" containerName="console" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.051981 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" containerName="console" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.052168 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffb1a6d7-9220-473e-9fcd-8d91d590f3a5" containerName="console" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.052201 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d6caa5-f77a-4acf-a631-0c3abb84959c" containerName="registry" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.052903 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.069994 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-664586d6fb-g55cf"] Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.211762 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-console-config\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.212158 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-serving-cert\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.212196 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-oauth-config\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.212287 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-oauth-serving-cert\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.212466 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-service-ca\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.212627 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-trusted-ca-bundle\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.212780 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln452\" (UniqueName: \"kubernetes.io/projected/42357e7c-de03-4b8b-80f5-f946411c67f7-kube-api-access-ln452\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.314512 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-serving-cert\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.314580 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-oauth-config\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.314675 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-oauth-serving-cert\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.314725 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-service-ca\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.314772 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-trusted-ca-bundle\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.314831 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln452\" (UniqueName: \"kubernetes.io/projected/42357e7c-de03-4b8b-80f5-f946411c67f7-kube-api-access-ln452\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.314883 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-console-config\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.315547 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-service-ca\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.316576 
4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-console-config\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.316815 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-oauth-serving-cert\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.317556 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-trusted-ca-bundle\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.323276 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-serving-cert\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.330779 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-oauth-config\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.342431 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln452\" (UniqueName: \"kubernetes.io/projected/42357e7c-de03-4b8b-80f5-f946411c67f7-kube-api-access-ln452\") pod \"console-664586d6fb-g55cf\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.375567 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:29:50 crc kubenswrapper[4886]: I0129 16:29:50.630058 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-664586d6fb-g55cf"] Jan 29 16:29:51 crc kubenswrapper[4886]: I0129 16:29:51.527068 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-664586d6fb-g55cf" event={"ID":"42357e7c-de03-4b8b-80f5-f946411c67f7","Type":"ContainerStarted","Data":"6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38"} Jan 29 16:29:51 crc kubenswrapper[4886]: I0129 16:29:51.527456 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-664586d6fb-g55cf" event={"ID":"42357e7c-de03-4b8b-80f5-f946411c67f7","Type":"ContainerStarted","Data":"4c6fe087595c24e70608f508c9599d4ead9e60d5c503746f12585384b13bc295"} Jan 29 16:29:51 crc kubenswrapper[4886]: I0129 16:29:51.549462 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-664586d6fb-g55cf" podStartSLOduration=1.5494434419999998 podStartE2EDuration="1.549443442s" podCreationTimestamp="2026-01-29 16:29:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:29:51.54526108 +0000 UTC m=+474.453980372" watchObservedRunningTime="2026-01-29 16:29:51.549443442 +0000 UTC m=+474.458162724" Jan 29 16:29:51 crc kubenswrapper[4886]: E0129 16:29:51.617317 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:29:51 crc kubenswrapper[4886]: E0129 16:29:51.617706 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:29:55 crc kubenswrapper[4886]: E0129 16:29:55.619399 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:29:59 crc kubenswrapper[4886]: E0129 16:29:59.617132 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.176958 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9"] Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.177928 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.182262 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.182391 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.187987 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9"] Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.289160 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqmrl\" (UniqueName: \"kubernetes.io/projected/18290a86-b94a-42c5-9f50-1614077f881b-kube-api-access-cqmrl\") pod \"collect-profiles-29495070-xnbx9\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.289244 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18290a86-b94a-42c5-9f50-1614077f881b-secret-volume\") pod \"collect-profiles-29495070-xnbx9\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.289425 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18290a86-b94a-42c5-9f50-1614077f881b-config-volume\") pod \"collect-profiles-29495070-xnbx9\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.376365 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.376415 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.382173 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.390392 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18290a86-b94a-42c5-9f50-1614077f881b-secret-volume\") pod \"collect-profiles-29495070-xnbx9\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.390446 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18290a86-b94a-42c5-9f50-1614077f881b-config-volume\") pod \"collect-profiles-29495070-xnbx9\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.390518 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cqmrl\" (UniqueName: \"kubernetes.io/projected/18290a86-b94a-42c5-9f50-1614077f881b-kube-api-access-cqmrl\") pod \"collect-profiles-29495070-xnbx9\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.391794 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18290a86-b94a-42c5-9f50-1614077f881b-config-volume\") pod \"collect-profiles-29495070-xnbx9\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.398287 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18290a86-b94a-42c5-9f50-1614077f881b-secret-volume\") pod \"collect-profiles-29495070-xnbx9\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.406809 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqmrl\" (UniqueName: \"kubernetes.io/projected/18290a86-b94a-42c5-9f50-1614077f881b-kube-api-access-cqmrl\") pod \"collect-profiles-29495070-xnbx9\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.502071 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.600287 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.675529 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54754b854f-fgkbk"] Jan 29 16:30:00 crc kubenswrapper[4886]: I0129 16:30:00.781902 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9"] Jan 29 16:30:01 crc kubenswrapper[4886]: I0129 16:30:01.602034 4886 generic.go:334] "Generic (PLEG): container finished" podID="18290a86-b94a-42c5-9f50-1614077f881b" containerID="5f38a23b3e231c3670461bd30eb72fab48714dac00ff0dbd8042edb99ce295c4" exitCode=0 Jan 29 16:30:01 crc kubenswrapper[4886]: I0129 16:30:01.602417 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" event={"ID":"18290a86-b94a-42c5-9f50-1614077f881b","Type":"ContainerDied","Data":"5f38a23b3e231c3670461bd30eb72fab48714dac00ff0dbd8042edb99ce295c4"} Jan 29 16:30:01 crc kubenswrapper[4886]: I0129 16:30:01.602676 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" event={"ID":"18290a86-b94a-42c5-9f50-1614077f881b","Type":"ContainerStarted","Data":"71c1d5e9632004d3ae72c5f6e641a2523e7dd35669f7e1827e51af004d1e1ae3"} Jan 29 16:30:02 crc kubenswrapper[4886]: I0129 16:30:02.838203 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:02 crc kubenswrapper[4886]: I0129 16:30:02.930487 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqmrl\" (UniqueName: \"kubernetes.io/projected/18290a86-b94a-42c5-9f50-1614077f881b-kube-api-access-cqmrl\") pod \"18290a86-b94a-42c5-9f50-1614077f881b\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " Jan 29 16:30:02 crc kubenswrapper[4886]: I0129 16:30:02.930640 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18290a86-b94a-42c5-9f50-1614077f881b-config-volume\") pod \"18290a86-b94a-42c5-9f50-1614077f881b\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " Jan 29 16:30:02 crc kubenswrapper[4886]: I0129 16:30:02.930671 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18290a86-b94a-42c5-9f50-1614077f881b-secret-volume\") pod \"18290a86-b94a-42c5-9f50-1614077f881b\" (UID: \"18290a86-b94a-42c5-9f50-1614077f881b\") " Jan 29 16:30:02 crc kubenswrapper[4886]: I0129 16:30:02.931531 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18290a86-b94a-42c5-9f50-1614077f881b-config-volume" (OuterVolumeSpecName: "config-volume") pod "18290a86-b94a-42c5-9f50-1614077f881b" (UID: "18290a86-b94a-42c5-9f50-1614077f881b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:30:02 crc kubenswrapper[4886]: I0129 16:30:02.936168 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18290a86-b94a-42c5-9f50-1614077f881b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "18290a86-b94a-42c5-9f50-1614077f881b" (UID: "18290a86-b94a-42c5-9f50-1614077f881b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:30:02 crc kubenswrapper[4886]: I0129 16:30:02.936775 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18290a86-b94a-42c5-9f50-1614077f881b-kube-api-access-cqmrl" (OuterVolumeSpecName: "kube-api-access-cqmrl") pod "18290a86-b94a-42c5-9f50-1614077f881b" (UID: "18290a86-b94a-42c5-9f50-1614077f881b"). InnerVolumeSpecName "kube-api-access-cqmrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:30:03 crc kubenswrapper[4886]: I0129 16:30:03.031728 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18290a86-b94a-42c5-9f50-1614077f881b-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 16:30:03 crc kubenswrapper[4886]: I0129 16:30:03.031759 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18290a86-b94a-42c5-9f50-1614077f881b-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 16:30:03 crc kubenswrapper[4886]: I0129 16:30:03.031770 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqmrl\" (UniqueName: \"kubernetes.io/projected/18290a86-b94a-42c5-9f50-1614077f881b-kube-api-access-cqmrl\") on node \"crc\" DevicePath \"\"" Jan 29 16:30:03 crc kubenswrapper[4886]: I0129 16:30:03.613281 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" event={"ID":"18290a86-b94a-42c5-9f50-1614077f881b","Type":"ContainerDied","Data":"71c1d5e9632004d3ae72c5f6e641a2523e7dd35669f7e1827e51af004d1e1ae3"} Jan 29 16:30:03 crc kubenswrapper[4886]: I0129 16:30:03.613698 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71c1d5e9632004d3ae72c5f6e641a2523e7dd35669f7e1827e51af004d1e1ae3" Jan 29 16:30:03 crc kubenswrapper[4886]: I0129 16:30:03.613410 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9" Jan 29 16:30:04 crc kubenswrapper[4886]: E0129 16:30:04.616916 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:30:04 crc kubenswrapper[4886]: E0129 16:30:04.616950 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:30:08 crc kubenswrapper[4886]: E0129 16:30:08.624136 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:30:12 crc kubenswrapper[4886]: E0129 16:30:12.617203 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:30:16 crc kubenswrapper[4886]: E0129 16:30:16.617957 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:30:17 crc kubenswrapper[4886]: E0129 16:30:17.617247 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:30:21 crc kubenswrapper[4886]: E0129 16:30:21.616363 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:30:24 crc kubenswrapper[4886]: E0129 16:30:24.618006 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:30:25 crc kubenswrapper[4886]: I0129 16:30:25.730829 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-54754b854f-fgkbk" podUID="56fe8de1-76b0-42ad-9f62-53ac51eac78d" containerName="console" containerID="cri-o://912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790" gracePeriod=15 Jan 29 16:30:25 crc kubenswrapper[4886]: I0129 16:30:25.746802 4886 patch_prober.go:28] interesting pod/console-54754b854f-fgkbk container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.73:8443/health\": dial tcp 10.217.0.73:8443: connect: connection refused" start-of-body= Jan 29 16:30:25 crc kubenswrapper[4886]: I0129 16:30:25.746857 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-54754b854f-fgkbk" podUID="56fe8de1-76b0-42ad-9f62-53ac51eac78d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.73:8443/health\": dial tcp 10.217.0.73:8443: connect: connection refused" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.159475 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54754b854f-fgkbk_56fe8de1-76b0-42ad-9f62-53ac51eac78d/console/0.log" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.159813 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.294544 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-service-ca\") pod \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.294597 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-trusted-ca-bundle\") pod \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.294712 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-config\") pod \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.294747 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-serving-cert\") pod \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.294802 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-oauth-serving-cert\") pod \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.294839 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-oauth-config\") pod \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.294882 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqdgg\" (UniqueName: \"kubernetes.io/projected/56fe8de1-76b0-42ad-9f62-53ac51eac78d-kube-api-access-hqdgg\") pod \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\" (UID: \"56fe8de1-76b0-42ad-9f62-53ac51eac78d\") " Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.295450 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "56fe8de1-76b0-42ad-9f62-53ac51eac78d" (UID: "56fe8de1-76b0-42ad-9f62-53ac51eac78d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.295588 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-service-ca" (OuterVolumeSpecName: "service-ca") pod "56fe8de1-76b0-42ad-9f62-53ac51eac78d" (UID: "56fe8de1-76b0-42ad-9f62-53ac51eac78d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.296026 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-config" (OuterVolumeSpecName: "console-config") pod "56fe8de1-76b0-42ad-9f62-53ac51eac78d" (UID: "56fe8de1-76b0-42ad-9f62-53ac51eac78d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.296215 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "56fe8de1-76b0-42ad-9f62-53ac51eac78d" (UID: "56fe8de1-76b0-42ad-9f62-53ac51eac78d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.299506 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "56fe8de1-76b0-42ad-9f62-53ac51eac78d" (UID: "56fe8de1-76b0-42ad-9f62-53ac51eac78d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.299851 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "56fe8de1-76b0-42ad-9f62-53ac51eac78d" (UID: "56fe8de1-76b0-42ad-9f62-53ac51eac78d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.300632 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56fe8de1-76b0-42ad-9f62-53ac51eac78d-kube-api-access-hqdgg" (OuterVolumeSpecName: "kube-api-access-hqdgg") pod "56fe8de1-76b0-42ad-9f62-53ac51eac78d" (UID: "56fe8de1-76b0-42ad-9f62-53ac51eac78d"). InnerVolumeSpecName "kube-api-access-hqdgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.396595 4886 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-config\") on node \"crc\" DevicePath \"\"" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.396828 4886 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.396886 4886 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.396934 4886 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56fe8de1-76b0-42ad-9f62-53ac51eac78d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.396982 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqdgg\" (UniqueName: \"kubernetes.io/projected/56fe8de1-76b0-42ad-9f62-53ac51eac78d-kube-api-access-hqdgg\") on node \"crc\" DevicePath \"\"" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.397039 4886 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.397087 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56fe8de1-76b0-42ad-9f62-53ac51eac78d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.758426 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54754b854f-fgkbk_56fe8de1-76b0-42ad-9f62-53ac51eac78d/console/0.log" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.758475 4886 generic.go:334] "Generic (PLEG): container finished" podID="56fe8de1-76b0-42ad-9f62-53ac51eac78d" containerID="912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790" exitCode=2 Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.758511 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54754b854f-fgkbk" event={"ID":"56fe8de1-76b0-42ad-9f62-53ac51eac78d","Type":"ContainerDied","Data":"912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790"} Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.758543 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54754b854f-fgkbk" event={"ID":"56fe8de1-76b0-42ad-9f62-53ac51eac78d","Type":"ContainerDied","Data":"92457371ca67ffbaa6957a21cf77005c4601275089a8ad1b5d44bb6186c2a4ce"} Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.758571 4886 scope.go:117] "RemoveContainer" containerID="912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.758691 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54754b854f-fgkbk" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.780175 4886 scope.go:117] "RemoveContainer" containerID="912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790" Jan 29 16:30:26 crc kubenswrapper[4886]: E0129 16:30:26.780904 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790\": container with ID starting with 912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790 not found: ID does not exist" containerID="912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.780973 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790"} err="failed to get container status \"912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790\": rpc error: code = NotFound desc = could not find container \"912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790\": container with ID starting with 912b8ca8f57d0bc2a261b229c7ccc6eafc982f004db336b3f33746c6d8c5a790 not found: ID does not exist" Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.782909 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54754b854f-fgkbk"] Jan 29 16:30:26 crc kubenswrapper[4886]: I0129 16:30:26.793137 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54754b854f-fgkbk"] Jan 29 16:30:28 crc kubenswrapper[4886]: I0129 16:30:28.623516 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56fe8de1-76b0-42ad-9f62-53ac51eac78d" path="/var/lib/kubelet/pods/56fe8de1-76b0-42ad-9f62-53ac51eac78d/volumes" Jan 29 16:30:29 crc kubenswrapper[4886]: E0129 16:30:29.617530 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:30:30 crc kubenswrapper[4886]: E0129 16:30:30.617504 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:30:34 crc kubenswrapper[4886]: E0129 16:30:34.618446 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:30:38 crc kubenswrapper[4886]: E0129 16:30:38.621893 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:30:43 crc kubenswrapper[4886]: E0129 16:30:43.618057 4886 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:30:45 crc kubenswrapper[4886]: E0129 16:30:45.616804 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:30:49 crc kubenswrapper[4886]: I0129 16:30:49.618960 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 16:30:49 crc kubenswrapper[4886]: E0129 16:30:49.750147 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 16:30:49 crc kubenswrapper[4886]: E0129 16:30:49.750310 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mlnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jfv6k_openshift-marketplace(69003a39-1c09-4087-a494-ebfd69e973cf): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:30:49 crc kubenswrapper[4886]: E0129 16:30:49.751495 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from 
registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:30:52 crc kubenswrapper[4886]: E0129 16:30:52.616511 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:30:55 crc kubenswrapper[4886]: E0129 16:30:55.617896 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:31:00 crc kubenswrapper[4886]: E0129 16:31:00.766961 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 16:31:00 crc kubenswrapper[4886]: E0129 16:31:00.767679 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vf7sq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4qbl4_openshift-marketplace(57aa9115-b2d5-45aa-8ac3-e251c0907e45): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:31:00 crc kubenswrapper[4886]: E0129 16:31:00.769029 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: 
Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:31:01 crc kubenswrapper[4886]: E0129 16:31:01.616942 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:31:07 crc kubenswrapper[4886]: E0129 16:31:07.746402 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 16:31:07 crc kubenswrapper[4886]: E0129 16:31:07.747546 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8jsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-q5hs7_openshift-marketplace(a7325ad0-28bf-45e0-bbd5-160f441de091): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:31:07 crc kubenswrapper[4886]: E0129 16:31:07.748883 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:31:08 crc kubenswrapper[4886]: E0129 16:31:08.812692 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = 
initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 16:31:08 crc kubenswrapper[4886]: E0129 16:31:08.812902 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn92n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zkk68_openshift-marketplace(d84ce3e9-c41a-4a08-8d86-2a918d5e9450): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:31:08 crc kubenswrapper[4886]: E0129 16:31:08.814902 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:31:13 crc kubenswrapper[4886]: E0129 16:31:13.616117 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:31:14 crc kubenswrapper[4886]: E0129 16:31:14.617026 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:31:19 crc kubenswrapper[4886]: E0129 16:31:19.617502 4886 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:31:20 crc kubenswrapper[4886]: E0129 16:31:20.618893 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:31:25 crc kubenswrapper[4886]: E0129 16:31:25.617271 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:31:26 crc kubenswrapper[4886]: E0129 16:31:26.617945 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:31:29 crc kubenswrapper[4886]: I0129 16:31:29.660440 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:31:29 crc kubenswrapper[4886]: I0129 16:31:29.660785 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:31:30 crc kubenswrapper[4886]: E0129 16:31:30.616936 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:31:33 crc kubenswrapper[4886]: E0129 16:31:33.617033 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:31:38 crc kubenswrapper[4886]: E0129 16:31:38.623349 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:31:41 crc kubenswrapper[4886]: E0129 16:31:41.617413 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:31:45 crc kubenswrapper[4886]: E0129 16:31:45.617685 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:31:48 crc kubenswrapper[4886]: E0129 16:31:48.623428 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:31:52 crc kubenswrapper[4886]: E0129 16:31:52.617582 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:31:56 crc kubenswrapper[4886]: E0129 16:31:56.616566 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:31:59 crc kubenswrapper[4886]: I0129 16:31:59.661730 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:31:59 crc kubenswrapper[4886]: I0129 16:31:59.662161 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:32:00 crc kubenswrapper[4886]: E0129 16:32:00.616590 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:32:01 crc kubenswrapper[4886]: E0129 16:32:01.616181 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:32:05 crc kubenswrapper[4886]: E0129 16:32:05.616627 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:32:07 crc kubenswrapper[4886]: E0129 16:32:07.617425 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:32:12 crc kubenswrapper[4886]: E0129 16:32:12.617095 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:32:13 crc kubenswrapper[4886]: E0129 16:32:13.616069 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:32:17 crc kubenswrapper[4886]: E0129 16:32:17.616892 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:32:20 crc kubenswrapper[4886]: E0129 16:32:20.618010 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:32:27 crc kubenswrapper[4886]: E0129 16:32:27.618397 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:32:27 crc kubenswrapper[4886]: E0129 16:32:27.619307 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:32:29 crc kubenswrapper[4886]: I0129 16:32:29.661616 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:32:29 crc kubenswrapper[4886]: I0129 16:32:29.661706 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:32:29 crc kubenswrapper[4886]: I0129 16:32:29.661771 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 16:32:29 crc kubenswrapper[4886]: I0129 16:32:29.662847 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae7876e7e5e026deccf52515d738eb4b775938bb13eef71ab45573508b57aaa0"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 16:32:29 crc kubenswrapper[4886]: I0129 16:32:29.662959 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://ae7876e7e5e026deccf52515d738eb4b775938bb13eef71ab45573508b57aaa0" gracePeriod=600 Jan 29 16:32:30 crc kubenswrapper[4886]: I0129 16:32:30.612594 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="ae7876e7e5e026deccf52515d738eb4b775938bb13eef71ab45573508b57aaa0" exitCode=0 Jan 29 16:32:30 crc kubenswrapper[4886]: I0129 16:32:30.612691 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"ae7876e7e5e026deccf52515d738eb4b775938bb13eef71ab45573508b57aaa0"} Jan 29 16:32:30 crc kubenswrapper[4886]: I0129 16:32:30.613313 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"773fe28c1c2f4b4e6b5a35ea611b7d8ab8f392d8f1b68bb09ec93e5c483b53ed"} Jan 29 16:32:30 crc kubenswrapper[4886]: I0129 16:32:30.613370 4886 scope.go:117] "RemoveContainer" containerID="96fb4b3b0684eec0f8e815c984345d77640459634c9d28cbf8434505ebf34891" Jan 29 16:32:30 crc kubenswrapper[4886]: E0129 16:32:30.617894 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:32:31 crc kubenswrapper[4886]: E0129 16:32:31.617535 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:32:38 crc kubenswrapper[4886]: E0129 16:32:38.619939 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:32:40 crc kubenswrapper[4886]: E0129 16:32:40.616641 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:32:44 crc kubenswrapper[4886]: E0129 16:32:44.618014 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:32:46 crc kubenswrapper[4886]: E0129 16:32:46.617555 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:32:49 crc kubenswrapper[4886]: E0129 16:32:49.616440 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:32:53 crc kubenswrapper[4886]: E0129 16:32:53.618240 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:32:56 crc kubenswrapper[4886]: E0129 16:32:56.618104 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:32:57 crc kubenswrapper[4886]: E0129 16:32:57.619131 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:33:01 crc kubenswrapper[4886]: E0129 16:33:01.617984 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:33:08 crc kubenswrapper[4886]: E0129 16:33:08.623440 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:33:09 crc kubenswrapper[4886]: E0129 16:33:09.616008 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:33:09 crc kubenswrapper[4886]: E0129 16:33:09.616274 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:33:14 crc kubenswrapper[4886]: E0129 16:33:14.619411 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.621139 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bsnwn"] Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.622184 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="nbdb" containerID="cri-o://aff586e7c8306a470164e6d1603b7a84b79e22ff53f7871cff535736f72f77b8" gracePeriod=30 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.622239 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://db747d554077a641bca85a4b376af5cc3ebe9e9addb59303e40961567d28422a" gracePeriod=30 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.622386 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="northd" containerID="cri-o://1103e45d1299bd7cc9890cc70e1b35be3c7e5cdc36cdc23191cb32c65b6851af" gracePeriod=30 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.622312 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovn-acl-logging" containerID="cri-o://54fecd80df24f20c923283f6966a565b8cf9cee51d2194836164df5fc69600b8" gracePeriod=30 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.622569 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="sbdb" containerID="cri-o://38f5a9a3458a900401d93f99197abc69e3baaf3038a89e74d142344fbf0d9ff5" gracePeriod=30 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.622587 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="kube-rbac-proxy-node" containerID="cri-o://34083f87301d604fb38ce6765e0d429895295ab0c89f02abfc1cfde1d71f4454" gracePeriod=30 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.622676 4886 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovn-controller" containerID="cri-o://b912acee2b3fec4fd1d0704a94a867e79b9191286159220760027325f0709c51" gracePeriod=30 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.671645 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" containerID="cri-o://f3e810b92c533dbff0b37232e3b59d6146e02214a9506edd851862a6737312a5" gracePeriod=30 Jan 29 16:33:19 crc kubenswrapper[4886]: E0129 16:33:19.674561 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aff586e7c8306a470164e6d1603b7a84b79e22ff53f7871cff535736f72f77b8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 29 16:33:19 crc kubenswrapper[4886]: E0129 16:33:19.674949 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38f5a9a3458a900401d93f99197abc69e3baaf3038a89e74d142344fbf0d9ff5" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 29 16:33:19 crc kubenswrapper[4886]: E0129 16:33:19.677570 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38f5a9a3458a900401d93f99197abc69e3baaf3038a89e74d142344fbf0d9ff5" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 29 16:33:19 crc kubenswrapper[4886]: E0129 16:33:19.677687 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aff586e7c8306a470164e6d1603b7a84b79e22ff53f7871cff535736f72f77b8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 29 16:33:19 crc kubenswrapper[4886]: E0129 16:33:19.679596 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aff586e7c8306a470164e6d1603b7a84b79e22ff53f7871cff535736f72f77b8" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 29 16:33:19 crc kubenswrapper[4886]: E0129 16:33:19.679657 4886 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="nbdb" Jan 29 16:33:19 crc kubenswrapper[4886]: E0129 16:33:19.680282 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38f5a9a3458a900401d93f99197abc69e3baaf3038a89e74d142344fbf0d9ff5" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 29 16:33:19 crc kubenswrapper[4886]: E0129 16:33:19.680359 4886 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="sbdb" Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.981450 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsnwn_d46238ab-90d4-41b8-b546-6dbff06cf5ed/ovnkube-controller/3.log" Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.984722 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsnwn_d46238ab-90d4-41b8-b546-6dbff06cf5ed/ovn-acl-logging/0.log" Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985234 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsnwn_d46238ab-90d4-41b8-b546-6dbff06cf5ed/ovn-controller/0.log" Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985615 4886 generic.go:334] "Generic (PLEG): container finished" podID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerID="f3e810b92c533dbff0b37232e3b59d6146e02214a9506edd851862a6737312a5" exitCode=0 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985641 4886 generic.go:334] "Generic (PLEG): container finished" podID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerID="38f5a9a3458a900401d93f99197abc69e3baaf3038a89e74d142344fbf0d9ff5" exitCode=0 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985651 4886 generic.go:334] "Generic (PLEG): container finished" podID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerID="aff586e7c8306a470164e6d1603b7a84b79e22ff53f7871cff535736f72f77b8" exitCode=0 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985660 4886 generic.go:334] "Generic (PLEG): container finished" podID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerID="1103e45d1299bd7cc9890cc70e1b35be3c7e5cdc36cdc23191cb32c65b6851af" exitCode=0 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985670 4886 generic.go:334] "Generic (PLEG): container finished" podID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerID="54fecd80df24f20c923283f6966a565b8cf9cee51d2194836164df5fc69600b8" exitCode=143 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985681 4886 generic.go:334] "Generic (PLEG): container finished" podID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerID="b912acee2b3fec4fd1d0704a94a867e79b9191286159220760027325f0709c51" exitCode=143 Jan 29 16:33:19 crc 
kubenswrapper[4886]: I0129 16:33:19.985703 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" event={"ID":"d46238ab-90d4-41b8-b546-6dbff06cf5ed","Type":"ContainerDied","Data":"f3e810b92c533dbff0b37232e3b59d6146e02214a9506edd851862a6737312a5"} Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985737 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" event={"ID":"d46238ab-90d4-41b8-b546-6dbff06cf5ed","Type":"ContainerDied","Data":"38f5a9a3458a900401d93f99197abc69e3baaf3038a89e74d142344fbf0d9ff5"} Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985757 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" event={"ID":"d46238ab-90d4-41b8-b546-6dbff06cf5ed","Type":"ContainerDied","Data":"aff586e7c8306a470164e6d1603b7a84b79e22ff53f7871cff535736f72f77b8"} Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985766 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" event={"ID":"d46238ab-90d4-41b8-b546-6dbff06cf5ed","Type":"ContainerDied","Data":"1103e45d1299bd7cc9890cc70e1b35be3c7e5cdc36cdc23191cb32c65b6851af"} Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985779 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" event={"ID":"d46238ab-90d4-41b8-b546-6dbff06cf5ed","Type":"ContainerDied","Data":"54fecd80df24f20c923283f6966a565b8cf9cee51d2194836164df5fc69600b8"} Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985787 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" event={"ID":"d46238ab-90d4-41b8-b546-6dbff06cf5ed","Type":"ContainerDied","Data":"b912acee2b3fec4fd1d0704a94a867e79b9191286159220760027325f0709c51"} Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.985802 4886 scope.go:117] "RemoveContainer" containerID="a0641acb8929ee41033e4169acb367c2a8a89a440e89fc29dde22190651e439f" Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.987550 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4dstj_b415d17e-f329-40e7-8a3f-32881cb5347a/kube-multus/2.log" Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.988040 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4dstj_b415d17e-f329-40e7-8a3f-32881cb5347a/kube-multus/1.log" Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.988094 4886 generic.go:334] "Generic (PLEG): container finished" podID="b415d17e-f329-40e7-8a3f-32881cb5347a" containerID="e74f1c8b65fe500a145e8a234d995565d439027c89c5aa1da47c13b626c7d606" exitCode=2 Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.988129 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4dstj" event={"ID":"b415d17e-f329-40e7-8a3f-32881cb5347a","Type":"ContainerDied","Data":"e74f1c8b65fe500a145e8a234d995565d439027c89c5aa1da47c13b626c7d606"} Jan 29 16:33:19 crc kubenswrapper[4886]: I0129 16:33:19.988732 4886 scope.go:117] "RemoveContainer" containerID="e74f1c8b65fe500a145e8a234d995565d439027c89c5aa1da47c13b626c7d606" Jan 29 16:33:19 crc kubenswrapper[4886]: E0129 16:33:19.989041 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-4dstj_openshift-multus(b415d17e-f329-40e7-8a3f-32881cb5347a)\"" 
pod="openshift-multus/multus-4dstj" podUID="b415d17e-f329-40e7-8a3f-32881cb5347a" Jan 29 16:33:20 crc kubenswrapper[4886]: I0129 16:33:20.021452 4886 scope.go:117] "RemoveContainer" containerID="0fbf425aaf0e257fa72dc096677e8404be047665a998729a21862b66d4162248" Jan 29 16:33:20 crc kubenswrapper[4886]: I0129 16:33:20.997218 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4dstj_b415d17e-f329-40e7-8a3f-32881cb5347a/kube-multus/2.log" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.003323 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsnwn_d46238ab-90d4-41b8-b546-6dbff06cf5ed/ovn-acl-logging/0.log" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.003944 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsnwn_d46238ab-90d4-41b8-b546-6dbff06cf5ed/ovn-controller/0.log" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.004488 4886 generic.go:334] "Generic (PLEG): container finished" podID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerID="db747d554077a641bca85a4b376af5cc3ebe9e9addb59303e40961567d28422a" exitCode=0 Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.004540 4886 generic.go:334] "Generic (PLEG): container finished" podID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerID="34083f87301d604fb38ce6765e0d429895295ab0c89f02abfc1cfde1d71f4454" exitCode=0 Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.004574 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" event={"ID":"d46238ab-90d4-41b8-b546-6dbff06cf5ed","Type":"ContainerDied","Data":"db747d554077a641bca85a4b376af5cc3ebe9e9addb59303e40961567d28422a"} Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.004613 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" event={"ID":"d46238ab-90d4-41b8-b546-6dbff06cf5ed","Type":"ContainerDied","Data":"34083f87301d604fb38ce6765e0d429895295ab0c89f02abfc1cfde1d71f4454"} Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.340735 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsnwn_d46238ab-90d4-41b8-b546-6dbff06cf5ed/ovn-acl-logging/0.log" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.341664 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsnwn_d46238ab-90d4-41b8-b546-6dbff06cf5ed/ovn-controller/0.log" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.342386 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.420383 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fm92b"] Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.420769 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.420805 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.420830 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="sbdb" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.420842 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="sbdb" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.420859 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.420871 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.420885 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="kubecfg-setup" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.420897 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="kubecfg-setup" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.420917 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovn-acl-logging" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.420930 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovn-acl-logging" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.420947 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovn-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.420963 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovn-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.420983 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="kube-rbac-proxy-ovn-metrics" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.420998 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="kube-rbac-proxy-ovn-metrics" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.421023 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fe8de1-76b0-42ad-9f62-53ac51eac78d" containerName="console" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421039 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fe8de1-76b0-42ad-9f62-53ac51eac78d" containerName="console" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.421062 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="nbdb" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421077 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="nbdb" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.421094 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="kube-rbac-proxy-node" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421110 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="kube-rbac-proxy-node" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.421140 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="northd" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421157 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="northd" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.421183 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421202 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.421222 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421239 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.421259 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18290a86-b94a-42c5-9f50-1614077f881b" containerName="collect-profiles" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421275 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="18290a86-b94a-42c5-9f50-1614077f881b" containerName="collect-profiles" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421508 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="kube-rbac-proxy-ovn-metrics" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421532 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421548 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="18290a86-b94a-42c5-9f50-1614077f881b" containerName="collect-profiles" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421592 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="sbdb" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421614 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovn-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421629 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovn-acl-logging" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421644 4886 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421658 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="kube-rbac-proxy-node" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421675 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="northd" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421691 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fe8de1-76b0-42ad-9f62-53ac51eac78d" containerName="console" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421713 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="nbdb" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421731 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.421924 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.421941 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.422131 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.422514 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" containerName="ovnkube-controller" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.424909 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504556 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-run-ovn\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504629 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-node-log\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504668 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-run-ovn-kubernetes\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504710 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504711 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504796 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-node-log" (OuterVolumeSpecName: "node-log") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504738 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504830 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504766 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504782 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-cni-netd\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504932 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d46238ab-90d4-41b8-b546-6dbff06cf5ed-ovnkube-script-lib\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504970 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-systemd-units\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.504998 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-run-openvswitch\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505031 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-var-lib-openvswitch\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505062 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505076 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d46238ab-90d4-41b8-b546-6dbff06cf5ed-env-overrides\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505100 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505105 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505112 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-etc-openvswitch\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505148 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505208 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-slash\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505266 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-run-systemd\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505303 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-run-netns\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505368 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-slash" (OuterVolumeSpecName: "host-slash") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505462 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8f8x\" (UniqueName: \"kubernetes.io/projected/d46238ab-90d4-41b8-b546-6dbff06cf5ed-kube-api-access-h8f8x\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505533 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-log-socket\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505570 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-kubelet\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505587 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d46238ab-90d4-41b8-b546-6dbff06cf5ed-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505578 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505661 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-log-socket" (OuterVolumeSpecName: "log-socket") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505645 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505630 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d46238ab-90d4-41b8-b546-6dbff06cf5ed-ovn-node-metrics-cert\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505765 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d46238ab-90d4-41b8-b546-6dbff06cf5ed-ovnkube-config\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505794 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d46238ab-90d4-41b8-b546-6dbff06cf5ed-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505828 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-cni-bin\") pod \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\" (UID: \"d46238ab-90d4-41b8-b546-6dbff06cf5ed\") " Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.505935 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506291 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d46238ab-90d4-41b8-b546-6dbff06cf5ed-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506452 4886 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506492 4886 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d46238ab-90d4-41b8-b546-6dbff06cf5ed-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506513 4886 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506531 4886 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506547 4886 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506562 4886 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d46238ab-90d4-41b8-b546-6dbff06cf5ed-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506578 4886 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506594 4886 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-slash\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506609 4886 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506625 4886 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-log-socket\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506640 4886 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506656 4886 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d46238ab-90d4-41b8-b546-6dbff06cf5ed-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506674 4886 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc 
kubenswrapper[4886]: I0129 16:33:21.506690 4886 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506705 4886 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-node-log\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506721 4886 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.506741 4886 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.513010 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d46238ab-90d4-41b8-b546-6dbff06cf5ed-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.513137 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46238ab-90d4-41b8-b546-6dbff06cf5ed-kube-api-access-h8f8x" (OuterVolumeSpecName: "kube-api-access-h8f8x") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "kube-api-access-h8f8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.536260 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d46238ab-90d4-41b8-b546-6dbff06cf5ed" (UID: "d46238ab-90d4-41b8-b546-6dbff06cf5ed"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.608128 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-systemd-units\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.608484 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-var-lib-openvswitch\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.608725 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.608936 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.609109 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwf9m\" (UniqueName: \"kubernetes.io/projected/19111cdf-053c-4093-af99-ad30edda5ec8-kube-api-access-qwf9m\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.609259 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19111cdf-053c-4093-af99-ad30edda5ec8-ovn-node-metrics-cert\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.609620 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19111cdf-053c-4093-af99-ad30edda5ec8-ovnkube-script-lib\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.609762 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-run-netns\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.609807 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-slash\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.609876 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-etc-openvswitch\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.609929 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-cni-bin\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.609960 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-run-ovn\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.609978 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-cni-netd\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.610000 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-log-socket\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.610024 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-node-log\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.610044 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-run-systemd\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.610084 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-kubelet\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.610102 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19111cdf-053c-4093-af99-ad30edda5ec8-ovnkube-config\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.610150 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-run-openvswitch\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.610180 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19111cdf-053c-4093-af99-ad30edda5ec8-env-overrides\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.610252 4886 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d46238ab-90d4-41b8-b546-6dbff06cf5ed-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.610269 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8f8x\" (UniqueName: \"kubernetes.io/projected/d46238ab-90d4-41b8-b546-6dbff06cf5ed-kube-api-access-h8f8x\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.610282 4886 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d46238ab-90d4-41b8-b546-6dbff06cf5ed-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 29 16:33:21 crc kubenswrapper[4886]: E0129 16:33:21.619230 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711548 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711615 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711641 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwf9m\" (UniqueName: \"kubernetes.io/projected/19111cdf-053c-4093-af99-ad30edda5ec8-kube-api-access-qwf9m\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc 
kubenswrapper[4886]: I0129 16:33:21.711666 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19111cdf-053c-4093-af99-ad30edda5ec8-ovn-node-metrics-cert\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711687 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19111cdf-053c-4093-af99-ad30edda5ec8-ovnkube-script-lib\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711719 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-run-netns\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711745 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-slash\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711784 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-etc-openvswitch\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711826 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-cni-bin\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711845 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-run-ovn\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711838 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711904 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-cni-netd\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.711863 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-cni-netd\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712005 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-log-socket\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712069 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-node-log\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712126 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-run-systemd\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712200 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19111cdf-053c-4093-af99-ad30edda5ec8-ovnkube-config\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712250 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-kubelet\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712388 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-run-openvswitch\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712434 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-log-socket\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712490 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-kubelet\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712510 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-run-openvswitch\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712437 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712391 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-node-log\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712569 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-run-systemd\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712591 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-etc-openvswitch\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712644 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-run-netns\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712686 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-slash\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712703 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19111cdf-053c-4093-af99-ad30edda5ec8-ovnkube-script-lib\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712443 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19111cdf-053c-4093-af99-ad30edda5ec8-env-overrides\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712751 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-var-lib-openvswitch\") pod \"ovnkube-node-fm92b\" (UID: 
\"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712775 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-systemd-units\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712841 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-systemd-units\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712917 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-run-ovn\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712943 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-var-lib-openvswitch\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.712752 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19111cdf-053c-4093-af99-ad30edda5ec8-host-cni-bin\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.713187 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19111cdf-053c-4093-af99-ad30edda5ec8-env-overrides\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.713635 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19111cdf-053c-4093-af99-ad30edda5ec8-ovnkube-config\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.715364 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19111cdf-053c-4093-af99-ad30edda5ec8-ovn-node-metrics-cert\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 16:33:21.728796 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwf9m\" (UniqueName: \"kubernetes.io/projected/19111cdf-053c-4093-af99-ad30edda5ec8-kube-api-access-qwf9m\") pod \"ovnkube-node-fm92b\" (UID: \"19111cdf-053c-4093-af99-ad30edda5ec8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: I0129 
16:33:21.747762 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:21 crc kubenswrapper[4886]: W0129 16:33:21.776673 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19111cdf_053c_4093_af99_ad30edda5ec8.slice/crio-135b31cd61eff929a85b1b414c60bba00d2e4e06835617685664137fa559c05d WatchSource:0}: Error finding container 135b31cd61eff929a85b1b414c60bba00d2e4e06835617685664137fa559c05d: Status 404 returned error can't find the container with id 135b31cd61eff929a85b1b414c60bba00d2e4e06835617685664137fa559c05d Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.012752 4886 generic.go:334] "Generic (PLEG): container finished" podID="19111cdf-053c-4093-af99-ad30edda5ec8" containerID="f683edcd1501f89a0b295a6a611bc59d07cfb788312c5f3e8fcb1155e41df8d2" exitCode=0 Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.012886 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" event={"ID":"19111cdf-053c-4093-af99-ad30edda5ec8","Type":"ContainerDied","Data":"f683edcd1501f89a0b295a6a611bc59d07cfb788312c5f3e8fcb1155e41df8d2"} Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.013160 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" event={"ID":"19111cdf-053c-4093-af99-ad30edda5ec8","Type":"ContainerStarted","Data":"135b31cd61eff929a85b1b414c60bba00d2e4e06835617685664137fa559c05d"} Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.020503 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsnwn_d46238ab-90d4-41b8-b546-6dbff06cf5ed/ovn-acl-logging/0.log" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.021248 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsnwn_d46238ab-90d4-41b8-b546-6dbff06cf5ed/ovn-controller/0.log" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.021975 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" event={"ID":"d46238ab-90d4-41b8-b546-6dbff06cf5ed","Type":"ContainerDied","Data":"4945a9e8ab72e79012e84ebf83643f2ee2b4c4028b579b7a2f7381c763968861"} Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.022041 4886 scope.go:117] "RemoveContainer" containerID="f3e810b92c533dbff0b37232e3b59d6146e02214a9506edd851862a6737312a5" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.022256 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bsnwn" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.065784 4886 scope.go:117] "RemoveContainer" containerID="38f5a9a3458a900401d93f99197abc69e3baaf3038a89e74d142344fbf0d9ff5" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.074483 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bsnwn"] Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.079172 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bsnwn"] Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.106209 4886 scope.go:117] "RemoveContainer" containerID="aff586e7c8306a470164e6d1603b7a84b79e22ff53f7871cff535736f72f77b8" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.127846 4886 scope.go:117] "RemoveContainer" containerID="1103e45d1299bd7cc9890cc70e1b35be3c7e5cdc36cdc23191cb32c65b6851af" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.144584 4886 scope.go:117] "RemoveContainer" containerID="db747d554077a641bca85a4b376af5cc3ebe9e9addb59303e40961567d28422a" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.166978 4886 scope.go:117] "RemoveContainer" containerID="34083f87301d604fb38ce6765e0d429895295ab0c89f02abfc1cfde1d71f4454" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.193166 4886 scope.go:117] "RemoveContainer" containerID="54fecd80df24f20c923283f6966a565b8cf9cee51d2194836164df5fc69600b8" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.215230 4886 scope.go:117] "RemoveContainer" containerID="b912acee2b3fec4fd1d0704a94a867e79b9191286159220760027325f0709c51" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.232115 4886 scope.go:117] "RemoveContainer" containerID="f18adfac47665579e806165f73793a4a301dcd95317ce1ac58ab8c4551aab72b" Jan 29 16:33:22 crc kubenswrapper[4886]: E0129 16:33:22.618782 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:33:22 crc kubenswrapper[4886]: I0129 16:33:22.639979 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d46238ab-90d4-41b8-b546-6dbff06cf5ed" path="/var/lib/kubelet/pods/d46238ab-90d4-41b8-b546-6dbff06cf5ed/volumes" Jan 29 16:33:23 crc kubenswrapper[4886]: I0129 16:33:23.031169 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" event={"ID":"19111cdf-053c-4093-af99-ad30edda5ec8","Type":"ContainerStarted","Data":"58729ca7ba88813b953ef04ed4a802c907de57bbeacf683e7b8182a8761c8104"} Jan 29 16:33:23 crc kubenswrapper[4886]: I0129 16:33:23.031218 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" event={"ID":"19111cdf-053c-4093-af99-ad30edda5ec8","Type":"ContainerStarted","Data":"04b3572bac5235c653957af0253cc167698dffb3f729f9847af808b432395b10"} Jan 29 16:33:23 crc kubenswrapper[4886]: I0129 16:33:23.031230 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" event={"ID":"19111cdf-053c-4093-af99-ad30edda5ec8","Type":"ContainerStarted","Data":"3f9ce34cbaedc840b516dff954995b2e4306927f416cc6d1c3da421fec5b8c77"} Jan 29 16:33:23 crc kubenswrapper[4886]: I0129 16:33:23.031242 4886 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" event={"ID":"19111cdf-053c-4093-af99-ad30edda5ec8","Type":"ContainerStarted","Data":"05408936123b4483b582625e2140810b7656b3c4c86da278a698264562d7a238"} Jan 29 16:33:23 crc kubenswrapper[4886]: I0129 16:33:23.031255 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" event={"ID":"19111cdf-053c-4093-af99-ad30edda5ec8","Type":"ContainerStarted","Data":"992a9a12d58b004fc5045fa851c0c5d8ddfd906aa6008b79e78cadc867d9eb25"} Jan 29 16:33:23 crc kubenswrapper[4886]: I0129 16:33:23.031266 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" event={"ID":"19111cdf-053c-4093-af99-ad30edda5ec8","Type":"ContainerStarted","Data":"6c5f1c457e473a871404a2ebb2563ac4ed21af223cc41661b3e680b2020432cb"} Jan 29 16:33:23 crc kubenswrapper[4886]: E0129 16:33:23.616441 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:33:25 crc kubenswrapper[4886]: I0129 16:33:25.050388 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" event={"ID":"19111cdf-053c-4093-af99-ad30edda5ec8","Type":"ContainerStarted","Data":"d145893750c9188369a0cc42d9e3cf847a7ab3852baf0847c0f303fd783f77b5"} Jan 29 16:33:26 crc kubenswrapper[4886]: E0129 16:33:26.618683 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:33:28 crc kubenswrapper[4886]: I0129 16:33:28.073977 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" event={"ID":"19111cdf-053c-4093-af99-ad30edda5ec8","Type":"ContainerStarted","Data":"fc15846ede655b62a2d71f8d125d4b6bdac031667a3d448eb4b594a8415eaca0"} Jan 29 16:33:28 crc kubenswrapper[4886]: I0129 16:33:28.074229 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:28 crc kubenswrapper[4886]: I0129 16:33:28.074374 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:28 crc kubenswrapper[4886]: I0129 16:33:28.099462 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:28 crc kubenswrapper[4886]: I0129 16:33:28.147959 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" podStartSLOduration=7.147937595 podStartE2EDuration="7.147937595s" podCreationTimestamp="2026-01-29 16:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:33:28.105685099 +0000 UTC m=+691.014404381" watchObservedRunningTime="2026-01-29 16:33:28.147937595 +0000 UTC m=+691.056656887" Jan 29 16:33:29 crc kubenswrapper[4886]: I0129 16:33:29.085718 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:29 crc kubenswrapper[4886]: I0129 16:33:29.111776 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:32 crc kubenswrapper[4886]: E0129 16:33:32.617918 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:33:33 crc kubenswrapper[4886]: I0129 16:33:33.615905 4886 scope.go:117] "RemoveContainer" containerID="e74f1c8b65fe500a145e8a234d995565d439027c89c5aa1da47c13b626c7d606" Jan 29 16:33:33 crc kubenswrapper[4886]: E0129 16:33:33.616574 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-4dstj_openshift-multus(b415d17e-f329-40e7-8a3f-32881cb5347a)\"" pod="openshift-multus/multus-4dstj" podUID="b415d17e-f329-40e7-8a3f-32881cb5347a" Jan 29 16:33:34 crc kubenswrapper[4886]: E0129 16:33:34.736917 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 16:33:34 crc kubenswrapper[4886]: E0129 16:33:34.737399 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mlnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jfv6k_openshift-marketplace(69003a39-1c09-4087-a494-ebfd69e973cf): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" 
logger="UnhandledError" Jan 29 16:33:34 crc kubenswrapper[4886]: E0129 16:33:34.738873 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:33:35 crc kubenswrapper[4886]: E0129 16:33:35.617897 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" Jan 29 16:33:40 crc kubenswrapper[4886]: E0129 16:33:40.618998 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:33:43 crc kubenswrapper[4886]: E0129 16:33:43.617382 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" Jan 29 16:33:44 crc kubenswrapper[4886]: I0129 16:33:44.615217 4886 scope.go:117] "RemoveContainer" containerID="e74f1c8b65fe500a145e8a234d995565d439027c89c5aa1da47c13b626c7d606" Jan 29 16:33:45 crc kubenswrapper[4886]: I0129 16:33:45.194228 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4dstj_b415d17e-f329-40e7-8a3f-32881cb5347a/kube-multus/2.log" Jan 29 16:33:45 crc kubenswrapper[4886]: I0129 16:33:45.194705 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4dstj" event={"ID":"b415d17e-f329-40e7-8a3f-32881cb5347a","Type":"ContainerStarted","Data":"f9b217aab06574ff3e962be323a2a8a06c95f4a16fa9897a5196355d9fc68145"} Jan 29 16:33:48 crc kubenswrapper[4886]: E0129 16:33:48.626229 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:33:51 crc kubenswrapper[4886]: I0129 16:33:51.232920 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qbl4" event={"ID":"57aa9115-b2d5-45aa-8ac3-e251c0907e45","Type":"ContainerStarted","Data":"d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4"} Jan 29 16:33:51 crc kubenswrapper[4886]: I0129 16:33:51.776460 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fm92b" Jan 29 16:33:52 crc kubenswrapper[4886]: I0129 16:33:52.900217 4886 generic.go:334] "Generic (PLEG): container finished" podID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" containerID="d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4" exitCode=0 Jan 29 16:33:52 crc 
kubenswrapper[4886]: I0129 16:33:52.900305 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qbl4" event={"ID":"57aa9115-b2d5-45aa-8ac3-e251c0907e45","Type":"ContainerDied","Data":"d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4"} Jan 29 16:33:53 crc kubenswrapper[4886]: E0129 16:33:53.735076 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 16:33:53 crc kubenswrapper[4886]: E0129 16:33:53.735598 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn92n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zkk68_openshift-marketplace(d84ce3e9-c41a-4a08-8d86-2a918d5e9450): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:33:53 crc kubenswrapper[4886]: E0129 16:33:53.736797 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:33:53 crc kubenswrapper[4886]: I0129 16:33:53.907213 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qbl4" event={"ID":"57aa9115-b2d5-45aa-8ac3-e251c0907e45","Type":"ContainerStarted","Data":"26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d"} Jan 29 16:33:53 crc kubenswrapper[4886]: I0129 16:33:53.926927 4886 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-marketplace-4qbl4" podStartSLOduration=2.240297018 podStartE2EDuration="5m47.92690452s" podCreationTimestamp="2026-01-29 16:28:06 +0000 UTC" firstStartedPulling="2026-01-29 16:28:07.715205598 +0000 UTC m=+370.623924870" lastFinishedPulling="2026-01-29 16:33:53.4018131 +0000 UTC m=+716.310532372" observedRunningTime="2026-01-29 16:33:53.923442004 +0000 UTC m=+716.832161276" watchObservedRunningTime="2026-01-29 16:33:53.92690452 +0000 UTC m=+716.835623812" Jan 29 16:33:55 crc kubenswrapper[4886]: I0129 16:33:55.931513 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5hs7" event={"ID":"a7325ad0-28bf-45e0-bbd5-160f441de091","Type":"ContainerStarted","Data":"35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88"} Jan 29 16:33:56 crc kubenswrapper[4886]: I0129 16:33:56.942809 4886 generic.go:334] "Generic (PLEG): container finished" podID="a7325ad0-28bf-45e0-bbd5-160f441de091" containerID="35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88" exitCode=0 Jan 29 16:33:56 crc kubenswrapper[4886]: I0129 16:33:56.942866 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5hs7" event={"ID":"a7325ad0-28bf-45e0-bbd5-160f441de091","Type":"ContainerDied","Data":"35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88"} Jan 29 16:33:57 crc kubenswrapper[4886]: I0129 16:33:57.016994 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:33:57 crc kubenswrapper[4886]: I0129 16:33:57.017067 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:33:57 crc kubenswrapper[4886]: I0129 16:33:57.087717 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:33:57 crc kubenswrapper[4886]: I0129 16:33:57.953314 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5hs7" event={"ID":"a7325ad0-28bf-45e0-bbd5-160f441de091","Type":"ContainerStarted","Data":"efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e"} Jan 29 16:33:58 crc kubenswrapper[4886]: I0129 16:33:58.001527 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q5hs7" podStartSLOduration=3.35713132 podStartE2EDuration="5m54.001507629s" podCreationTimestamp="2026-01-29 16:28:04 +0000 UTC" firstStartedPulling="2026-01-29 16:28:06.706720432 +0000 UTC m=+369.615439704" lastFinishedPulling="2026-01-29 16:33:57.351096731 +0000 UTC m=+720.259816013" observedRunningTime="2026-01-29 16:33:57.998030063 +0000 UTC m=+720.906749355" watchObservedRunningTime="2026-01-29 16:33:58.001507629 +0000 UTC m=+720.910226921" Jan 29 16:33:59 crc kubenswrapper[4886]: E0129 16:33:59.616685 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:34:05 crc kubenswrapper[4886]: I0129 16:34:05.192131 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:34:05 crc kubenswrapper[4886]: 
I0129 16:34:05.192439 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:34:05 crc kubenswrapper[4886]: I0129 16:34:05.249205 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:34:06 crc kubenswrapper[4886]: I0129 16:34:06.085686 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:34:07 crc kubenswrapper[4886]: I0129 16:34:07.069820 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:34:07 crc kubenswrapper[4886]: E0129 16:34:07.620783 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:34:12 crc kubenswrapper[4886]: E0129 16:34:12.617531 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:34:18 crc kubenswrapper[4886]: E0129 16:34:18.626121 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:34:25 crc kubenswrapper[4886]: E0129 16:34:25.618134 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:34:29 crc kubenswrapper[4886]: I0129 16:34:29.661088 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:34:29 crc kubenswrapper[4886]: I0129 16:34:29.661470 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:34:30 crc kubenswrapper[4886]: E0129 16:34:30.618480 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:34:38 crc kubenswrapper[4886]: E0129 16:34:38.623491 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:34:43 crc kubenswrapper[4886]: E0129 16:34:43.618121 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:34:48 crc kubenswrapper[4886]: I0129 16:34:48.195944 4886 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 29 16:34:52 crc kubenswrapper[4886]: E0129 16:34:52.618701 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:34:57 crc kubenswrapper[4886]: E0129 16:34:57.618116 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:34:59 crc kubenswrapper[4886]: I0129 16:34:59.661132 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:34:59 crc kubenswrapper[4886]: I0129 16:34:59.661224 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:35:04 crc kubenswrapper[4886]: E0129 16:35:04.618399 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:35:08 crc kubenswrapper[4886]: E0129 16:35:08.621257 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:35:18 crc kubenswrapper[4886]: E0129 16:35:18.621769 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:35:22 
crc kubenswrapper[4886]: E0129 16:35:22.618683 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:35:29 crc kubenswrapper[4886]: I0129 16:35:29.660860 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:35:29 crc kubenswrapper[4886]: I0129 16:35:29.661734 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:35:29 crc kubenswrapper[4886]: I0129 16:35:29.661800 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 16:35:29 crc kubenswrapper[4886]: I0129 16:35:29.662773 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"773fe28c1c2f4b4e6b5a35ea611b7d8ab8f392d8f1b68bb09ec93e5c483b53ed"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 16:35:29 crc kubenswrapper[4886]: I0129 16:35:29.662863 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://773fe28c1c2f4b4e6b5a35ea611b7d8ab8f392d8f1b68bb09ec93e5c483b53ed" gracePeriod=600 Jan 29 16:35:30 crc kubenswrapper[4886]: E0129 16:35:30.616841 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:35:30 crc kubenswrapper[4886]: I0129 16:35:30.645235 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="773fe28c1c2f4b4e6b5a35ea611b7d8ab8f392d8f1b68bb09ec93e5c483b53ed" exitCode=0 Jan 29 16:35:30 crc kubenswrapper[4886]: I0129 16:35:30.645284 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"773fe28c1c2f4b4e6b5a35ea611b7d8ab8f392d8f1b68bb09ec93e5c483b53ed"} Jan 29 16:35:30 crc kubenswrapper[4886]: I0129 16:35:30.645347 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"50ba5c9bdbddc145f7d20c044a7cd326eb16e00aa141bfc3e8c4f610ef31ae97"} Jan 29 16:35:30 crc kubenswrapper[4886]: I0129 16:35:30.645365 4886 
scope.go:117] "RemoveContainer" containerID="ae7876e7e5e026deccf52515d738eb4b775938bb13eef71ab45573508b57aaa0" Jan 29 16:35:34 crc kubenswrapper[4886]: E0129 16:35:34.618869 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.293060 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b9tgx"] Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.296298 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.309448 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9tgx"] Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.385760 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-utilities\") pod \"redhat-marketplace-b9tgx\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.386065 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-catalog-content\") pod \"redhat-marketplace-b9tgx\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.386218 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b96x9\" (UniqueName: \"kubernetes.io/projected/73caa1a0-803a-489b-925a-62f4c7d85295-kube-api-access-b96x9\") pod \"redhat-marketplace-b9tgx\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.488043 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-utilities\") pod \"redhat-marketplace-b9tgx\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.488526 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-catalog-content\") pod \"redhat-marketplace-b9tgx\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.488784 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b96x9\" (UniqueName: \"kubernetes.io/projected/73caa1a0-803a-489b-925a-62f4c7d85295-kube-api-access-b96x9\") pod \"redhat-marketplace-b9tgx\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.488809 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-utilities\") pod \"redhat-marketplace-b9tgx\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.489455 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-catalog-content\") pod \"redhat-marketplace-b9tgx\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.516177 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b96x9\" (UniqueName: \"kubernetes.io/projected/73caa1a0-803a-489b-925a-62f4c7d85295-kube-api-access-b96x9\") pod \"redhat-marketplace-b9tgx\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.626600 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:35:40 crc kubenswrapper[4886]: I0129 16:35:40.855156 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9tgx"] Jan 29 16:35:41 crc kubenswrapper[4886]: I0129 16:35:41.730466 4886 generic.go:334] "Generic (PLEG): container finished" podID="73caa1a0-803a-489b-925a-62f4c7d85295" containerID="0af281ba22a48525a89d293814700da60b0038002508ae0fe09557b961c806e8" exitCode=0 Jan 29 16:35:41 crc kubenswrapper[4886]: I0129 16:35:41.730529 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9tgx" event={"ID":"73caa1a0-803a-489b-925a-62f4c7d85295","Type":"ContainerDied","Data":"0af281ba22a48525a89d293814700da60b0038002508ae0fe09557b961c806e8"} Jan 29 16:35:41 crc kubenswrapper[4886]: I0129 16:35:41.730591 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9tgx" event={"ID":"73caa1a0-803a-489b-925a-62f4c7d85295","Type":"ContainerStarted","Data":"6b3bcf1eb7b421af2b3c1dea2211d6c94e3f2fcb7c357bd69518e7c58f34f4f8"} Jan 29 16:35:41 crc kubenswrapper[4886]: E0129 16:35:41.881937 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 16:35:41 crc kubenswrapper[4886]: E0129 16:35:41.882646 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b96x9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-b9tgx_openshift-marketplace(73caa1a0-803a-489b-925a-62f4c7d85295): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:35:41 crc kubenswrapper[4886]: E0129 16:35:41.883992 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-b9tgx" podUID="73caa1a0-803a-489b-925a-62f4c7d85295" Jan 29 16:35:42 crc kubenswrapper[4886]: E0129 16:35:42.742183 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-b9tgx" podUID="73caa1a0-803a-489b-925a-62f4c7d85295" Jan 29 16:35:43 crc kubenswrapper[4886]: E0129 16:35:43.616770 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:35:46 crc kubenswrapper[4886]: E0129 16:35:46.619544 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:35:53 crc kubenswrapper[4886]: I0129 16:35:53.620014 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 16:35:54 crc kubenswrapper[4886]: I0129 16:35:54.858953 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="73caa1a0-803a-489b-925a-62f4c7d85295" containerID="c0a3c283ef8d7e07ee977dc4f960790916999f5c601d1154dce01509fccc0843" exitCode=0 Jan 29 16:35:54 crc kubenswrapper[4886]: I0129 16:35:54.859066 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9tgx" event={"ID":"73caa1a0-803a-489b-925a-62f4c7d85295","Type":"ContainerDied","Data":"c0a3c283ef8d7e07ee977dc4f960790916999f5c601d1154dce01509fccc0843"} Jan 29 16:35:55 crc kubenswrapper[4886]: I0129 16:35:55.871056 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9tgx" event={"ID":"73caa1a0-803a-489b-925a-62f4c7d85295","Type":"ContainerStarted","Data":"1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6"} Jan 29 16:35:55 crc kubenswrapper[4886]: I0129 16:35:55.897476 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b9tgx" podStartSLOduration=2.353241804 podStartE2EDuration="15.897456928s" podCreationTimestamp="2026-01-29 16:35:40 +0000 UTC" firstStartedPulling="2026-01-29 16:35:41.732577828 +0000 UTC m=+824.641297130" lastFinishedPulling="2026-01-29 16:35:55.276792972 +0000 UTC m=+838.185512254" observedRunningTime="2026-01-29 16:35:55.895453483 +0000 UTC m=+838.804172775" watchObservedRunningTime="2026-01-29 16:35:55.897456928 +0000 UTC m=+838.806176200" Jan 29 16:35:56 crc kubenswrapper[4886]: E0129 16:35:56.616749 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:36:00 crc kubenswrapper[4886]: I0129 16:36:00.631181 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:36:00 crc kubenswrapper[4886]: I0129 16:36:00.632149 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:36:00 crc kubenswrapper[4886]: I0129 16:36:00.683450 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:36:00 crc kubenswrapper[4886]: I0129 16:36:00.945010 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:36:01 crc kubenswrapper[4886]: E0129 16:36:01.619301 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.068222 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9tgx"] Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.068507 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b9tgx" podUID="73caa1a0-803a-489b-925a-62f4c7d85295" containerName="registry-server" containerID="cri-o://1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6" gracePeriod=2 Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.471698 
4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.549261 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-utilities\") pod \"73caa1a0-803a-489b-925a-62f4c7d85295\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.549407 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-catalog-content\") pod \"73caa1a0-803a-489b-925a-62f4c7d85295\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.549499 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b96x9\" (UniqueName: \"kubernetes.io/projected/73caa1a0-803a-489b-925a-62f4c7d85295-kube-api-access-b96x9\") pod \"73caa1a0-803a-489b-925a-62f4c7d85295\" (UID: \"73caa1a0-803a-489b-925a-62f4c7d85295\") " Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.551464 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-utilities" (OuterVolumeSpecName: "utilities") pod "73caa1a0-803a-489b-925a-62f4c7d85295" (UID: "73caa1a0-803a-489b-925a-62f4c7d85295"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.555595 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73caa1a0-803a-489b-925a-62f4c7d85295-kube-api-access-b96x9" (OuterVolumeSpecName: "kube-api-access-b96x9") pod "73caa1a0-803a-489b-925a-62f4c7d85295" (UID: "73caa1a0-803a-489b-925a-62f4c7d85295"). InnerVolumeSpecName "kube-api-access-b96x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.589506 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73caa1a0-803a-489b-925a-62f4c7d85295" (UID: "73caa1a0-803a-489b-925a-62f4c7d85295"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.651737 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.651889 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b96x9\" (UniqueName: \"kubernetes.io/projected/73caa1a0-803a-489b-925a-62f4c7d85295-kube-api-access-b96x9\") on node \"crc\" DevicePath \"\"" Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.652044 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73caa1a0-803a-489b-925a-62f4c7d85295-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.927628 4886 generic.go:334] "Generic (PLEG): container finished" podID="73caa1a0-803a-489b-925a-62f4c7d85295" containerID="1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6" exitCode=0 Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.927689 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9tgx" Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.927700 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9tgx" event={"ID":"73caa1a0-803a-489b-925a-62f4c7d85295","Type":"ContainerDied","Data":"1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6"} Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.927747 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9tgx" event={"ID":"73caa1a0-803a-489b-925a-62f4c7d85295","Type":"ContainerDied","Data":"6b3bcf1eb7b421af2b3c1dea2211d6c94e3f2fcb7c357bd69518e7c58f34f4f8"} Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.927765 4886 scope.go:117] "RemoveContainer" containerID="1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6" Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.962803 4886 scope.go:117] "RemoveContainer" containerID="c0a3c283ef8d7e07ee977dc4f960790916999f5c601d1154dce01509fccc0843" Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.970272 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9tgx"] Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.977098 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9tgx"] Jan 29 16:36:03 crc kubenswrapper[4886]: I0129 16:36:03.995284 4886 scope.go:117] "RemoveContainer" containerID="0af281ba22a48525a89d293814700da60b0038002508ae0fe09557b961c806e8" Jan 29 16:36:04 crc kubenswrapper[4886]: I0129 16:36:04.023710 4886 scope.go:117] "RemoveContainer" containerID="1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6" Jan 29 16:36:04 crc kubenswrapper[4886]: E0129 16:36:04.024259 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6\": container with ID starting with 1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6 not found: ID does not exist" containerID="1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6" Jan 29 16:36:04 crc kubenswrapper[4886]: I0129 16:36:04.024392 4886 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6"} err="failed to get container status \"1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6\": rpc error: code = NotFound desc = could not find container \"1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6\": container with ID starting with 1df3debc9dc32a464a8a01ceb66660fd934db40e0418a44b73501976b98cd6f6 not found: ID does not exist" Jan 29 16:36:04 crc kubenswrapper[4886]: I0129 16:36:04.024542 4886 scope.go:117] "RemoveContainer" containerID="c0a3c283ef8d7e07ee977dc4f960790916999f5c601d1154dce01509fccc0843" Jan 29 16:36:04 crc kubenswrapper[4886]: E0129 16:36:04.025045 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0a3c283ef8d7e07ee977dc4f960790916999f5c601d1154dce01509fccc0843\": container with ID starting with c0a3c283ef8d7e07ee977dc4f960790916999f5c601d1154dce01509fccc0843 not found: ID does not exist" containerID="c0a3c283ef8d7e07ee977dc4f960790916999f5c601d1154dce01509fccc0843" Jan 29 16:36:04 crc kubenswrapper[4886]: I0129 16:36:04.025081 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a3c283ef8d7e07ee977dc4f960790916999f5c601d1154dce01509fccc0843"} err="failed to get container status \"c0a3c283ef8d7e07ee977dc4f960790916999f5c601d1154dce01509fccc0843\": rpc error: code = NotFound desc = could not find container \"c0a3c283ef8d7e07ee977dc4f960790916999f5c601d1154dce01509fccc0843\": container with ID starting with c0a3c283ef8d7e07ee977dc4f960790916999f5c601d1154dce01509fccc0843 not found: ID does not exist" Jan 29 16:36:04 crc kubenswrapper[4886]: I0129 16:36:04.025109 4886 scope.go:117] "RemoveContainer" containerID="0af281ba22a48525a89d293814700da60b0038002508ae0fe09557b961c806e8" Jan 29 16:36:04 crc kubenswrapper[4886]: E0129 16:36:04.025425 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af281ba22a48525a89d293814700da60b0038002508ae0fe09557b961c806e8\": container with ID starting with 0af281ba22a48525a89d293814700da60b0038002508ae0fe09557b961c806e8 not found: ID does not exist" containerID="0af281ba22a48525a89d293814700da60b0038002508ae0fe09557b961c806e8" Jan 29 16:36:04 crc kubenswrapper[4886]: I0129 16:36:04.025475 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af281ba22a48525a89d293814700da60b0038002508ae0fe09557b961c806e8"} err="failed to get container status \"0af281ba22a48525a89d293814700da60b0038002508ae0fe09557b961c806e8\": rpc error: code = NotFound desc = could not find container \"0af281ba22a48525a89d293814700da60b0038002508ae0fe09557b961c806e8\": container with ID starting with 0af281ba22a48525a89d293814700da60b0038002508ae0fe09557b961c806e8 not found: ID does not exist" Jan 29 16:36:04 crc kubenswrapper[4886]: I0129 16:36:04.628574 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73caa1a0-803a-489b-925a-62f4c7d85295" path="/var/lib/kubelet/pods/73caa1a0-803a-489b-925a-62f4c7d85295/volumes" Jan 29 16:36:11 crc kubenswrapper[4886]: E0129 16:36:11.617454 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:36:14 crc kubenswrapper[4886]: E0129 16:36:14.617761 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:36:23 crc kubenswrapper[4886]: E0129 16:36:23.618508 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:36:29 crc kubenswrapper[4886]: E0129 16:36:29.618391 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:36:35 crc kubenswrapper[4886]: E0129 16:36:35.617439 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:36:43 crc kubenswrapper[4886]: E0129 16:36:43.669449 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:36:47 crc kubenswrapper[4886]: E0129 16:36:47.618809 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:36:54 crc kubenswrapper[4886]: E0129 16:36:54.617979 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:37:01 crc kubenswrapper[4886]: E0129 16:37:01.618066 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:37:08 crc kubenswrapper[4886]: E0129 16:37:08.619929 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:37:12 crc kubenswrapper[4886]: E0129 16:37:12.617414 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:37:20 crc kubenswrapper[4886]: E0129 16:37:20.618406 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:37:26 crc kubenswrapper[4886]: E0129 16:37:26.621001 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:37:29 crc kubenswrapper[4886]: I0129 16:37:29.661167 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:37:29 crc kubenswrapper[4886]: I0129 16:37:29.661830 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:37:31 crc kubenswrapper[4886]: E0129 16:37:31.617282 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:37:37 crc kubenswrapper[4886]: E0129 16:37:37.619726 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:37:45 crc kubenswrapper[4886]: I0129 16:37:45.895999 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kfgdj"] Jan 29 16:37:45 crc kubenswrapper[4886]: E0129 16:37:45.897059 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73caa1a0-803a-489b-925a-62f4c7d85295" containerName="extract-utilities" Jan 29 16:37:45 crc kubenswrapper[4886]: I0129 16:37:45.897085 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="73caa1a0-803a-489b-925a-62f4c7d85295" containerName="extract-utilities" Jan 29 16:37:45 crc kubenswrapper[4886]: E0129 16:37:45.897116 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73caa1a0-803a-489b-925a-62f4c7d85295" 
containerName="registry-server" Jan 29 16:37:45 crc kubenswrapper[4886]: I0129 16:37:45.897130 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="73caa1a0-803a-489b-925a-62f4c7d85295" containerName="registry-server" Jan 29 16:37:45 crc kubenswrapper[4886]: E0129 16:37:45.897156 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73caa1a0-803a-489b-925a-62f4c7d85295" containerName="extract-content" Jan 29 16:37:45 crc kubenswrapper[4886]: I0129 16:37:45.897171 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="73caa1a0-803a-489b-925a-62f4c7d85295" containerName="extract-content" Jan 29 16:37:45 crc kubenswrapper[4886]: I0129 16:37:45.897516 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="73caa1a0-803a-489b-925a-62f4c7d85295" containerName="registry-server" Jan 29 16:37:45 crc kubenswrapper[4886]: I0129 16:37:45.899553 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:45 crc kubenswrapper[4886]: I0129 16:37:45.913679 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfgdj"] Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.036794 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-catalog-content\") pod \"certified-operators-kfgdj\" (UID: \"b4408259-440c-4434-ad5e-df143591092f\") " pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.036855 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-utilities\") pod \"certified-operators-kfgdj\" (UID: \"b4408259-440c-4434-ad5e-df143591092f\") " pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.036887 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt878\" (UniqueName: \"kubernetes.io/projected/b4408259-440c-4434-ad5e-df143591092f-kube-api-access-qt878\") pod \"certified-operators-kfgdj\" (UID: \"b4408259-440c-4434-ad5e-df143591092f\") " pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.138404 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-utilities\") pod \"certified-operators-kfgdj\" (UID: \"b4408259-440c-4434-ad5e-df143591092f\") " pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.138536 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt878\" (UniqueName: \"kubernetes.io/projected/b4408259-440c-4434-ad5e-df143591092f-kube-api-access-qt878\") pod \"certified-operators-kfgdj\" (UID: \"b4408259-440c-4434-ad5e-df143591092f\") " pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.138685 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-catalog-content\") pod \"certified-operators-kfgdj\" (UID: 
\"b4408259-440c-4434-ad5e-df143591092f\") " pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.139664 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-utilities\") pod \"certified-operators-kfgdj\" (UID: \"b4408259-440c-4434-ad5e-df143591092f\") " pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.139695 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-catalog-content\") pod \"certified-operators-kfgdj\" (UID: \"b4408259-440c-4434-ad5e-df143591092f\") " pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.172612 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt878\" (UniqueName: \"kubernetes.io/projected/b4408259-440c-4434-ad5e-df143591092f-kube-api-access-qt878\") pod \"certified-operators-kfgdj\" (UID: \"b4408259-440c-4434-ad5e-df143591092f\") " pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.231984 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.451094 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfgdj"] Jan 29 16:37:46 crc kubenswrapper[4886]: E0129 16:37:46.615850 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.746141 4886 generic.go:334] "Generic (PLEG): container finished" podID="b4408259-440c-4434-ad5e-df143591092f" containerID="d70d1fce763398b1fbc89d3ba02890b194f9bd437f727f0609064eb7a07084e7" exitCode=0 Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.746199 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfgdj" event={"ID":"b4408259-440c-4434-ad5e-df143591092f","Type":"ContainerDied","Data":"d70d1fce763398b1fbc89d3ba02890b194f9bd437f727f0609064eb7a07084e7"} Jan 29 16:37:46 crc kubenswrapper[4886]: I0129 16:37:46.746281 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfgdj" event={"ID":"b4408259-440c-4434-ad5e-df143591092f","Type":"ContainerStarted","Data":"65cd5bcca908b0b496f52d5ad6cc1abf1980809b3bca9141ebb7782171f5ef55"} Jan 29 16:37:48 crc kubenswrapper[4886]: I0129 16:37:48.765581 4886 generic.go:334] "Generic (PLEG): container finished" podID="b4408259-440c-4434-ad5e-df143591092f" containerID="4a8106271fae12af1142ac5ef147ed049a9212bff974e10331949896bfe2f22a" exitCode=0 Jan 29 16:37:48 crc kubenswrapper[4886]: I0129 16:37:48.765646 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfgdj" event={"ID":"b4408259-440c-4434-ad5e-df143591092f","Type":"ContainerDied","Data":"4a8106271fae12af1142ac5ef147ed049a9212bff974e10331949896bfe2f22a"} Jan 29 16:37:49 crc kubenswrapper[4886]: I0129 
16:37:49.788560 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfgdj" event={"ID":"b4408259-440c-4434-ad5e-df143591092f","Type":"ContainerStarted","Data":"7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee"} Jan 29 16:37:49 crc kubenswrapper[4886]: I0129 16:37:49.815538 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kfgdj" podStartSLOduration=2.343172752 podStartE2EDuration="4.815515761s" podCreationTimestamp="2026-01-29 16:37:45 +0000 UTC" firstStartedPulling="2026-01-29 16:37:46.748088802 +0000 UTC m=+949.656808084" lastFinishedPulling="2026-01-29 16:37:49.220431821 +0000 UTC m=+952.129151093" observedRunningTime="2026-01-29 16:37:49.811788192 +0000 UTC m=+952.720507464" watchObservedRunningTime="2026-01-29 16:37:49.815515761 +0000 UTC m=+952.724235043" Jan 29 16:37:52 crc kubenswrapper[4886]: E0129 16:37:52.617959 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:37:56 crc kubenswrapper[4886]: I0129 16:37:56.232884 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:56 crc kubenswrapper[4886]: I0129 16:37:56.234666 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:56 crc kubenswrapper[4886]: I0129 16:37:56.302904 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:56 crc kubenswrapper[4886]: I0129 16:37:56.897699 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:58 crc kubenswrapper[4886]: E0129 16:37:58.623586 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:37:58 crc kubenswrapper[4886]: I0129 16:37:58.678040 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfgdj"] Jan 29 16:37:58 crc kubenswrapper[4886]: I0129 16:37:58.853005 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kfgdj" podUID="b4408259-440c-4434-ad5e-df143591092f" containerName="registry-server" containerID="cri-o://7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee" gracePeriod=2 Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.660843 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.660897 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" 
podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.716102 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.845744 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-utilities\") pod \"b4408259-440c-4434-ad5e-df143591092f\" (UID: \"b4408259-440c-4434-ad5e-df143591092f\") " Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.845822 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-catalog-content\") pod \"b4408259-440c-4434-ad5e-df143591092f\" (UID: \"b4408259-440c-4434-ad5e-df143591092f\") " Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.845910 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt878\" (UniqueName: \"kubernetes.io/projected/b4408259-440c-4434-ad5e-df143591092f-kube-api-access-qt878\") pod \"b4408259-440c-4434-ad5e-df143591092f\" (UID: \"b4408259-440c-4434-ad5e-df143591092f\") " Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.847481 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-utilities" (OuterVolumeSpecName: "utilities") pod "b4408259-440c-4434-ad5e-df143591092f" (UID: "b4408259-440c-4434-ad5e-df143591092f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.852427 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4408259-440c-4434-ad5e-df143591092f-kube-api-access-qt878" (OuterVolumeSpecName: "kube-api-access-qt878") pod "b4408259-440c-4434-ad5e-df143591092f" (UID: "b4408259-440c-4434-ad5e-df143591092f"). InnerVolumeSpecName "kube-api-access-qt878". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.861581 4886 generic.go:334] "Generic (PLEG): container finished" podID="b4408259-440c-4434-ad5e-df143591092f" containerID="7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee" exitCode=0 Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.861632 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfgdj" event={"ID":"b4408259-440c-4434-ad5e-df143591092f","Type":"ContainerDied","Data":"7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee"} Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.861662 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfgdj" event={"ID":"b4408259-440c-4434-ad5e-df143591092f","Type":"ContainerDied","Data":"65cd5bcca908b0b496f52d5ad6cc1abf1980809b3bca9141ebb7782171f5ef55"} Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.861682 4886 scope.go:117] "RemoveContainer" containerID="7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.861680 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfgdj" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.903777 4886 scope.go:117] "RemoveContainer" containerID="4a8106271fae12af1142ac5ef147ed049a9212bff974e10331949896bfe2f22a" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.912433 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4408259-440c-4434-ad5e-df143591092f" (UID: "b4408259-440c-4434-ad5e-df143591092f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.919395 4886 scope.go:117] "RemoveContainer" containerID="d70d1fce763398b1fbc89d3ba02890b194f9bd437f727f0609064eb7a07084e7" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.947632 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt878\" (UniqueName: \"kubernetes.io/projected/b4408259-440c-4434-ad5e-df143591092f-kube-api-access-qt878\") on node \"crc\" DevicePath \"\"" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.947862 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.947952 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4408259-440c-4434-ad5e-df143591092f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.958201 4886 scope.go:117] "RemoveContainer" containerID="7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee" Jan 29 16:37:59 crc kubenswrapper[4886]: E0129 16:37:59.959016 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee\": container with ID starting with 7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee not found: ID does not exist" containerID="7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.959065 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee"} err="failed to get container status \"7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee\": rpc error: code = NotFound desc = could not find container \"7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee\": container with ID starting with 7f8799db03d44b9bc3afe805b7e6af24b1d2e2fc103b5b76d9aaef5455993dee not found: ID does not exist" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.959096 4886 scope.go:117] "RemoveContainer" containerID="4a8106271fae12af1142ac5ef147ed049a9212bff974e10331949896bfe2f22a" Jan 29 16:37:59 crc kubenswrapper[4886]: E0129 16:37:59.959565 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a8106271fae12af1142ac5ef147ed049a9212bff974e10331949896bfe2f22a\": container with ID starting with 4a8106271fae12af1142ac5ef147ed049a9212bff974e10331949896bfe2f22a not found: ID does not exist" containerID="4a8106271fae12af1142ac5ef147ed049a9212bff974e10331949896bfe2f22a" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.959649 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a8106271fae12af1142ac5ef147ed049a9212bff974e10331949896bfe2f22a"} err="failed to get container status \"4a8106271fae12af1142ac5ef147ed049a9212bff974e10331949896bfe2f22a\": rpc error: code = NotFound desc = could not find container \"4a8106271fae12af1142ac5ef147ed049a9212bff974e10331949896bfe2f22a\": container with ID starting with 4a8106271fae12af1142ac5ef147ed049a9212bff974e10331949896bfe2f22a not found: ID does not exist" Jan 29 16:37:59 crc 
kubenswrapper[4886]: I0129 16:37:59.959737 4886 scope.go:117] "RemoveContainer" containerID="d70d1fce763398b1fbc89d3ba02890b194f9bd437f727f0609064eb7a07084e7" Jan 29 16:37:59 crc kubenswrapper[4886]: E0129 16:37:59.960373 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d70d1fce763398b1fbc89d3ba02890b194f9bd437f727f0609064eb7a07084e7\": container with ID starting with d70d1fce763398b1fbc89d3ba02890b194f9bd437f727f0609064eb7a07084e7 not found: ID does not exist" containerID="d70d1fce763398b1fbc89d3ba02890b194f9bd437f727f0609064eb7a07084e7" Jan 29 16:37:59 crc kubenswrapper[4886]: I0129 16:37:59.960404 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70d1fce763398b1fbc89d3ba02890b194f9bd437f727f0609064eb7a07084e7"} err="failed to get container status \"d70d1fce763398b1fbc89d3ba02890b194f9bd437f727f0609064eb7a07084e7\": rpc error: code = NotFound desc = could not find container \"d70d1fce763398b1fbc89d3ba02890b194f9bd437f727f0609064eb7a07084e7\": container with ID starting with d70d1fce763398b1fbc89d3ba02890b194f9bd437f727f0609064eb7a07084e7 not found: ID does not exist" Jan 29 16:38:00 crc kubenswrapper[4886]: I0129 16:38:00.206990 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfgdj"] Jan 29 16:38:00 crc kubenswrapper[4886]: I0129 16:38:00.211707 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kfgdj"] Jan 29 16:38:00 crc kubenswrapper[4886]: I0129 16:38:00.643540 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4408259-440c-4434-ad5e-df143591092f" path="/var/lib/kubelet/pods/b4408259-440c-4434-ad5e-df143591092f/volumes" Jan 29 16:38:03 crc kubenswrapper[4886]: E0129 16:38:03.617057 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.291346 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pzrc9"] Jan 29 16:38:05 crc kubenswrapper[4886]: E0129 16:38:05.291656 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4408259-440c-4434-ad5e-df143591092f" containerName="registry-server" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.291674 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4408259-440c-4434-ad5e-df143591092f" containerName="registry-server" Jan 29 16:38:05 crc kubenswrapper[4886]: E0129 16:38:05.291714 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4408259-440c-4434-ad5e-df143591092f" containerName="extract-utilities" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.291725 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4408259-440c-4434-ad5e-df143591092f" containerName="extract-utilities" Jan 29 16:38:05 crc kubenswrapper[4886]: E0129 16:38:05.291740 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4408259-440c-4434-ad5e-df143591092f" containerName="extract-content" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.291750 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4408259-440c-4434-ad5e-df143591092f" 
containerName="extract-content" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.291942 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4408259-440c-4434-ad5e-df143591092f" containerName="registry-server" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.293204 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.308220 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzrc9"] Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.430877 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfsqr\" (UniqueName: \"kubernetes.io/projected/92af1116-2260-4c2f-a3b2-b3045d51065e-kube-api-access-xfsqr\") pod \"community-operators-pzrc9\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.430939 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-utilities\") pod \"community-operators-pzrc9\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.430999 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-catalog-content\") pod \"community-operators-pzrc9\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.532972 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-catalog-content\") pod \"community-operators-pzrc9\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.533425 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfsqr\" (UniqueName: \"kubernetes.io/projected/92af1116-2260-4c2f-a3b2-b3045d51065e-kube-api-access-xfsqr\") pod \"community-operators-pzrc9\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.533494 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-utilities\") pod \"community-operators-pzrc9\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.533664 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-catalog-content\") pod \"community-operators-pzrc9\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.534011 4886 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-utilities\") pod \"community-operators-pzrc9\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.566662 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfsqr\" (UniqueName: \"kubernetes.io/projected/92af1116-2260-4c2f-a3b2-b3045d51065e-kube-api-access-xfsqr\") pod \"community-operators-pzrc9\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.613312 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.801596 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzrc9"] Jan 29 16:38:05 crc kubenswrapper[4886]: I0129 16:38:05.904575 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzrc9" event={"ID":"92af1116-2260-4c2f-a3b2-b3045d51065e","Type":"ContainerStarted","Data":"a5aee8ffafba103be40b015f41604638130a97df1c4c358df1fd18e7cd77f933"} Jan 29 16:38:06 crc kubenswrapper[4886]: I0129 16:38:06.913908 4886 generic.go:334] "Generic (PLEG): container finished" podID="92af1116-2260-4c2f-a3b2-b3045d51065e" containerID="03dc4d084e92a1a4c1b13b14dd33a72a0ff570323d3fe9be1d52ad1281c0cc68" exitCode=0 Jan 29 16:38:06 crc kubenswrapper[4886]: I0129 16:38:06.913972 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzrc9" event={"ID":"92af1116-2260-4c2f-a3b2-b3045d51065e","Type":"ContainerDied","Data":"03dc4d084e92a1a4c1b13b14dd33a72a0ff570323d3fe9be1d52ad1281c0cc68"} Jan 29 16:38:07 crc kubenswrapper[4886]: I0129 16:38:07.923165 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzrc9" event={"ID":"92af1116-2260-4c2f-a3b2-b3045d51065e","Type":"ContainerStarted","Data":"e16b246f25ed8a9774fabcbf44fd890ca7a79123170eeee68579d3e84408cbde"} Jan 29 16:38:08 crc kubenswrapper[4886]: I0129 16:38:08.931270 4886 generic.go:334] "Generic (PLEG): container finished" podID="92af1116-2260-4c2f-a3b2-b3045d51065e" containerID="e16b246f25ed8a9774fabcbf44fd890ca7a79123170eeee68579d3e84408cbde" exitCode=0 Jan 29 16:38:08 crc kubenswrapper[4886]: I0129 16:38:08.931487 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzrc9" event={"ID":"92af1116-2260-4c2f-a3b2-b3045d51065e","Type":"ContainerDied","Data":"e16b246f25ed8a9774fabcbf44fd890ca7a79123170eeee68579d3e84408cbde"} Jan 29 16:38:09 crc kubenswrapper[4886]: I0129 16:38:09.940636 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzrc9" event={"ID":"92af1116-2260-4c2f-a3b2-b3045d51065e","Type":"ContainerStarted","Data":"f830e2985c2ce12ffdec81f706a0ce0df2ec836503f1182ff628ddf87c45db60"} Jan 29 16:38:13 crc kubenswrapper[4886]: E0129 16:38:13.616845 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:38:14 crc 
kubenswrapper[4886]: I0129 16:38:14.682869 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pzrc9" podStartSLOduration=7.255194957 podStartE2EDuration="9.68285469s" podCreationTimestamp="2026-01-29 16:38:05 +0000 UTC" firstStartedPulling="2026-01-29 16:38:06.918105346 +0000 UTC m=+969.826824628" lastFinishedPulling="2026-01-29 16:38:09.345765089 +0000 UTC m=+972.254484361" observedRunningTime="2026-01-29 16:38:09.969938362 +0000 UTC m=+972.878657654" watchObservedRunningTime="2026-01-29 16:38:14.68285469 +0000 UTC m=+977.591573962" Jan 29 16:38:14 crc kubenswrapper[4886]: I0129 16:38:14.684766 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vgrfs"] Jan 29 16:38:14 crc kubenswrapper[4886]: I0129 16:38:14.685890 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:14 crc kubenswrapper[4886]: I0129 16:38:14.710422 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgrfs"] Jan 29 16:38:14 crc kubenswrapper[4886]: I0129 16:38:14.795395 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-utilities\") pod \"redhat-operators-vgrfs\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:14 crc kubenswrapper[4886]: I0129 16:38:14.795764 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxpvl\" (UniqueName: \"kubernetes.io/projected/5bc856aa-a27b-4856-a888-7104df47cf30-kube-api-access-hxpvl\") pod \"redhat-operators-vgrfs\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:14 crc kubenswrapper[4886]: I0129 16:38:14.795941 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-catalog-content\") pod \"redhat-operators-vgrfs\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:14 crc kubenswrapper[4886]: I0129 16:38:14.897531 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-utilities\") pod \"redhat-operators-vgrfs\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:14 crc kubenswrapper[4886]: I0129 16:38:14.897612 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxpvl\" (UniqueName: \"kubernetes.io/projected/5bc856aa-a27b-4856-a888-7104df47cf30-kube-api-access-hxpvl\") pod \"redhat-operators-vgrfs\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:14 crc kubenswrapper[4886]: I0129 16:38:14.897650 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-catalog-content\") pod \"redhat-operators-vgrfs\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:14 
crc kubenswrapper[4886]: I0129 16:38:14.898105 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-catalog-content\") pod \"redhat-operators-vgrfs\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:14 crc kubenswrapper[4886]: I0129 16:38:14.898403 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-utilities\") pod \"redhat-operators-vgrfs\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:14 crc kubenswrapper[4886]: I0129 16:38:14.916716 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxpvl\" (UniqueName: \"kubernetes.io/projected/5bc856aa-a27b-4856-a888-7104df47cf30-kube-api-access-hxpvl\") pod \"redhat-operators-vgrfs\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:15 crc kubenswrapper[4886]: I0129 16:38:15.012073 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:15 crc kubenswrapper[4886]: I0129 16:38:15.233766 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgrfs"] Jan 29 16:38:15 crc kubenswrapper[4886]: W0129 16:38:15.246239 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bc856aa_a27b_4856_a888_7104df47cf30.slice/crio-642d1d75fa3c4399292c0700dad3ed1dd140aa358c860f9e89f06502f40c5255 WatchSource:0}: Error finding container 642d1d75fa3c4399292c0700dad3ed1dd140aa358c860f9e89f06502f40c5255: Status 404 returned error can't find the container with id 642d1d75fa3c4399292c0700dad3ed1dd140aa358c860f9e89f06502f40c5255 Jan 29 16:38:15 crc kubenswrapper[4886]: I0129 16:38:15.613911 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:15 crc kubenswrapper[4886]: I0129 16:38:15.613971 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:15 crc kubenswrapper[4886]: I0129 16:38:15.659812 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:15 crc kubenswrapper[4886]: I0129 16:38:15.974301 4886 generic.go:334] "Generic (PLEG): container finished" podID="5bc856aa-a27b-4856-a888-7104df47cf30" containerID="f321a5b5711c742d2c9335f082716b7a364f071a6e0ed342cb01bfcdaf92884a" exitCode=0 Jan 29 16:38:15 crc kubenswrapper[4886]: I0129 16:38:15.974354 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgrfs" event={"ID":"5bc856aa-a27b-4856-a888-7104df47cf30","Type":"ContainerDied","Data":"f321a5b5711c742d2c9335f082716b7a364f071a6e0ed342cb01bfcdaf92884a"} Jan 29 16:38:15 crc kubenswrapper[4886]: I0129 16:38:15.974397 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgrfs" event={"ID":"5bc856aa-a27b-4856-a888-7104df47cf30","Type":"ContainerStarted","Data":"642d1d75fa3c4399292c0700dad3ed1dd140aa358c860f9e89f06502f40c5255"} Jan 29 16:38:16 crc kubenswrapper[4886]: I0129 
16:38:16.018585 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:16 crc kubenswrapper[4886]: E0129 16:38:16.617189 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:38:17 crc kubenswrapper[4886]: I0129 16:38:17.988824 4886 generic.go:334] "Generic (PLEG): container finished" podID="5bc856aa-a27b-4856-a888-7104df47cf30" containerID="8d0b96bb16d9b428b30612fd0b938c1d4924a8676d599e2c175e0ae963ed72f3" exitCode=0 Jan 29 16:38:17 crc kubenswrapper[4886]: I0129 16:38:17.988954 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgrfs" event={"ID":"5bc856aa-a27b-4856-a888-7104df47cf30","Type":"ContainerDied","Data":"8d0b96bb16d9b428b30612fd0b938c1d4924a8676d599e2c175e0ae963ed72f3"} Jan 29 16:38:18 crc kubenswrapper[4886]: I0129 16:38:18.998788 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgrfs" event={"ID":"5bc856aa-a27b-4856-a888-7104df47cf30","Type":"ContainerStarted","Data":"b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396"} Jan 29 16:38:19 crc kubenswrapper[4886]: I0129 16:38:19.027948 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vgrfs" podStartSLOduration=2.537051581 podStartE2EDuration="5.027920308s" podCreationTimestamp="2026-01-29 16:38:14 +0000 UTC" firstStartedPulling="2026-01-29 16:38:15.975474954 +0000 UTC m=+978.884194226" lastFinishedPulling="2026-01-29 16:38:18.466343681 +0000 UTC m=+981.375062953" observedRunningTime="2026-01-29 16:38:19.027396003 +0000 UTC m=+981.936115325" watchObservedRunningTime="2026-01-29 16:38:19.027920308 +0000 UTC m=+981.936639610" Jan 29 16:38:19 crc kubenswrapper[4886]: I0129 16:38:19.277954 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzrc9"] Jan 29 16:38:19 crc kubenswrapper[4886]: I0129 16:38:19.278226 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pzrc9" podUID="92af1116-2260-4c2f-a3b2-b3045d51065e" containerName="registry-server" containerID="cri-o://f830e2985c2ce12ffdec81f706a0ce0df2ec836503f1182ff628ddf87c45db60" gracePeriod=2 Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.007727 4886 generic.go:334] "Generic (PLEG): container finished" podID="92af1116-2260-4c2f-a3b2-b3045d51065e" containerID="f830e2985c2ce12ffdec81f706a0ce0df2ec836503f1182ff628ddf87c45db60" exitCode=0 Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.008443 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzrc9" event={"ID":"92af1116-2260-4c2f-a3b2-b3045d51065e","Type":"ContainerDied","Data":"f830e2985c2ce12ffdec81f706a0ce0df2ec836503f1182ff628ddf87c45db60"} Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.292110 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.374160 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-catalog-content\") pod \"92af1116-2260-4c2f-a3b2-b3045d51065e\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.433736 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92af1116-2260-4c2f-a3b2-b3045d51065e" (UID: "92af1116-2260-4c2f-a3b2-b3045d51065e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.475128 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfsqr\" (UniqueName: \"kubernetes.io/projected/92af1116-2260-4c2f-a3b2-b3045d51065e-kube-api-access-xfsqr\") pod \"92af1116-2260-4c2f-a3b2-b3045d51065e\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.475217 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-utilities\") pod \"92af1116-2260-4c2f-a3b2-b3045d51065e\" (UID: \"92af1116-2260-4c2f-a3b2-b3045d51065e\") " Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.475717 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.475898 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-utilities" (OuterVolumeSpecName: "utilities") pod "92af1116-2260-4c2f-a3b2-b3045d51065e" (UID: "92af1116-2260-4c2f-a3b2-b3045d51065e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.479806 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92af1116-2260-4c2f-a3b2-b3045d51065e-kube-api-access-xfsqr" (OuterVolumeSpecName: "kube-api-access-xfsqr") pod "92af1116-2260-4c2f-a3b2-b3045d51065e" (UID: "92af1116-2260-4c2f-a3b2-b3045d51065e"). InnerVolumeSpecName "kube-api-access-xfsqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.577083 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfsqr\" (UniqueName: \"kubernetes.io/projected/92af1116-2260-4c2f-a3b2-b3045d51065e-kube-api-access-xfsqr\") on node \"crc\" DevicePath \"\"" Jan 29 16:38:20 crc kubenswrapper[4886]: I0129 16:38:20.577130 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92af1116-2260-4c2f-a3b2-b3045d51065e-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:38:21 crc kubenswrapper[4886]: I0129 16:38:21.017055 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzrc9" event={"ID":"92af1116-2260-4c2f-a3b2-b3045d51065e","Type":"ContainerDied","Data":"a5aee8ffafba103be40b015f41604638130a97df1c4c358df1fd18e7cd77f933"} Jan 29 16:38:21 crc kubenswrapper[4886]: I0129 16:38:21.017114 4886 scope.go:117] "RemoveContainer" containerID="f830e2985c2ce12ffdec81f706a0ce0df2ec836503f1182ff628ddf87c45db60" Jan 29 16:38:21 crc kubenswrapper[4886]: I0129 16:38:21.017116 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzrc9" Jan 29 16:38:21 crc kubenswrapper[4886]: I0129 16:38:21.040267 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzrc9"] Jan 29 16:38:21 crc kubenswrapper[4886]: I0129 16:38:21.044366 4886 scope.go:117] "RemoveContainer" containerID="e16b246f25ed8a9774fabcbf44fd890ca7a79123170eeee68579d3e84408cbde" Jan 29 16:38:21 crc kubenswrapper[4886]: I0129 16:38:21.045792 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pzrc9"] Jan 29 16:38:21 crc kubenswrapper[4886]: I0129 16:38:21.067849 4886 scope.go:117] "RemoveContainer" containerID="03dc4d084e92a1a4c1b13b14dd33a72a0ff570323d3fe9be1d52ad1281c0cc68" Jan 29 16:38:22 crc kubenswrapper[4886]: I0129 16:38:22.621949 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92af1116-2260-4c2f-a3b2-b3045d51065e" path="/var/lib/kubelet/pods/92af1116-2260-4c2f-a3b2-b3045d51065e/volumes" Jan 29 16:38:25 crc kubenswrapper[4886]: I0129 16:38:25.012914 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:25 crc kubenswrapper[4886]: I0129 16:38:25.014502 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:25 crc kubenswrapper[4886]: I0129 16:38:25.051480 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:25 crc kubenswrapper[4886]: I0129 16:38:25.105141 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:26 crc kubenswrapper[4886]: E0129 16:38:26.617715 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:38:27 crc kubenswrapper[4886]: I0129 16:38:27.676266 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vgrfs"] 
Jan 29 16:38:27 crc kubenswrapper[4886]: I0129 16:38:27.676739 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vgrfs" podUID="5bc856aa-a27b-4856-a888-7104df47cf30" containerName="registry-server" containerID="cri-o://b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396" gracePeriod=2 Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.032886 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.069050 4886 generic.go:334] "Generic (PLEG): container finished" podID="5bc856aa-a27b-4856-a888-7104df47cf30" containerID="b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396" exitCode=0 Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.069106 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgrfs" event={"ID":"5bc856aa-a27b-4856-a888-7104df47cf30","Type":"ContainerDied","Data":"b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396"} Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.069139 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgrfs" event={"ID":"5bc856aa-a27b-4856-a888-7104df47cf30","Type":"ContainerDied","Data":"642d1d75fa3c4399292c0700dad3ed1dd140aa358c860f9e89f06502f40c5255"} Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.069160 4886 scope.go:117] "RemoveContainer" containerID="b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.069312 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgrfs" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.087478 4886 scope.go:117] "RemoveContainer" containerID="8d0b96bb16d9b428b30612fd0b938c1d4924a8676d599e2c175e0ae963ed72f3" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.103017 4886 scope.go:117] "RemoveContainer" containerID="f321a5b5711c742d2c9335f082716b7a364f071a6e0ed342cb01bfcdaf92884a" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.107280 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxpvl\" (UniqueName: \"kubernetes.io/projected/5bc856aa-a27b-4856-a888-7104df47cf30-kube-api-access-hxpvl\") pod \"5bc856aa-a27b-4856-a888-7104df47cf30\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.107441 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-utilities\") pod \"5bc856aa-a27b-4856-a888-7104df47cf30\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.107481 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-catalog-content\") pod \"5bc856aa-a27b-4856-a888-7104df47cf30\" (UID: \"5bc856aa-a27b-4856-a888-7104df47cf30\") " Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.112278 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc856aa-a27b-4856-a888-7104df47cf30-kube-api-access-hxpvl" (OuterVolumeSpecName: "kube-api-access-hxpvl") pod 
"5bc856aa-a27b-4856-a888-7104df47cf30" (UID: "5bc856aa-a27b-4856-a888-7104df47cf30"). InnerVolumeSpecName "kube-api-access-hxpvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.113165 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-utilities" (OuterVolumeSpecName: "utilities") pod "5bc856aa-a27b-4856-a888-7104df47cf30" (UID: "5bc856aa-a27b-4856-a888-7104df47cf30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.150021 4886 scope.go:117] "RemoveContainer" containerID="b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396" Jan 29 16:38:28 crc kubenswrapper[4886]: E0129 16:38:28.150559 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396\": container with ID starting with b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396 not found: ID does not exist" containerID="b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.150656 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396"} err="failed to get container status \"b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396\": rpc error: code = NotFound desc = could not find container \"b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396\": container with ID starting with b953f483b87cc6eb1e353b30cfe440976c5d7b9acaa026806d5e31600d81f396 not found: ID does not exist" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.150694 4886 scope.go:117] "RemoveContainer" containerID="8d0b96bb16d9b428b30612fd0b938c1d4924a8676d599e2c175e0ae963ed72f3" Jan 29 16:38:28 crc kubenswrapper[4886]: E0129 16:38:28.151010 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d0b96bb16d9b428b30612fd0b938c1d4924a8676d599e2c175e0ae963ed72f3\": container with ID starting with 8d0b96bb16d9b428b30612fd0b938c1d4924a8676d599e2c175e0ae963ed72f3 not found: ID does not exist" containerID="8d0b96bb16d9b428b30612fd0b938c1d4924a8676d599e2c175e0ae963ed72f3" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.151091 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d0b96bb16d9b428b30612fd0b938c1d4924a8676d599e2c175e0ae963ed72f3"} err="failed to get container status \"8d0b96bb16d9b428b30612fd0b938c1d4924a8676d599e2c175e0ae963ed72f3\": rpc error: code = NotFound desc = could not find container \"8d0b96bb16d9b428b30612fd0b938c1d4924a8676d599e2c175e0ae963ed72f3\": container with ID starting with 8d0b96bb16d9b428b30612fd0b938c1d4924a8676d599e2c175e0ae963ed72f3 not found: ID does not exist" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.151117 4886 scope.go:117] "RemoveContainer" containerID="f321a5b5711c742d2c9335f082716b7a364f071a6e0ed342cb01bfcdaf92884a" Jan 29 16:38:28 crc kubenswrapper[4886]: E0129 16:38:28.151411 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f321a5b5711c742d2c9335f082716b7a364f071a6e0ed342cb01bfcdaf92884a\": container with ID starting 
with f321a5b5711c742d2c9335f082716b7a364f071a6e0ed342cb01bfcdaf92884a not found: ID does not exist" containerID="f321a5b5711c742d2c9335f082716b7a364f071a6e0ed342cb01bfcdaf92884a" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.151443 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f321a5b5711c742d2c9335f082716b7a364f071a6e0ed342cb01bfcdaf92884a"} err="failed to get container status \"f321a5b5711c742d2c9335f082716b7a364f071a6e0ed342cb01bfcdaf92884a\": rpc error: code = NotFound desc = could not find container \"f321a5b5711c742d2c9335f082716b7a364f071a6e0ed342cb01bfcdaf92884a\": container with ID starting with f321a5b5711c742d2c9335f082716b7a364f071a6e0ed342cb01bfcdaf92884a not found: ID does not exist" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.208858 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.208898 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxpvl\" (UniqueName: \"kubernetes.io/projected/5bc856aa-a27b-4856-a888-7104df47cf30-kube-api-access-hxpvl\") on node \"crc\" DevicePath \"\"" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.254402 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5bc856aa-a27b-4856-a888-7104df47cf30" (UID: "5bc856aa-a27b-4856-a888-7104df47cf30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.310212 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bc856aa-a27b-4856-a888-7104df47cf30-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.401668 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vgrfs"] Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.411877 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vgrfs"] Jan 29 16:38:28 crc kubenswrapper[4886]: E0129 16:38:28.618978 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" Jan 29 16:38:28 crc kubenswrapper[4886]: I0129 16:38:28.627220 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bc856aa-a27b-4856-a888-7104df47cf30" path="/var/lib/kubelet/pods/5bc856aa-a27b-4856-a888-7104df47cf30/volumes" Jan 29 16:38:29 crc kubenswrapper[4886]: I0129 16:38:29.662828 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:38:29 crc kubenswrapper[4886]: I0129 16:38:29.662929 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" 
podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:38:29 crc kubenswrapper[4886]: I0129 16:38:29.662978 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 16:38:29 crc kubenswrapper[4886]: I0129 16:38:29.663618 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"50ba5c9bdbddc145f7d20c044a7cd326eb16e00aa141bfc3e8c4f610ef31ae97"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 16:38:29 crc kubenswrapper[4886]: I0129 16:38:29.663665 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://50ba5c9bdbddc145f7d20c044a7cd326eb16e00aa141bfc3e8c4f610ef31ae97" gracePeriod=600 Jan 29 16:38:30 crc kubenswrapper[4886]: I0129 16:38:30.088120 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="50ba5c9bdbddc145f7d20c044a7cd326eb16e00aa141bfc3e8c4f610ef31ae97" exitCode=0 Jan 29 16:38:30 crc kubenswrapper[4886]: I0129 16:38:30.088193 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"50ba5c9bdbddc145f7d20c044a7cd326eb16e00aa141bfc3e8c4f610ef31ae97"} Jan 29 16:38:30 crc kubenswrapper[4886]: I0129 16:38:30.088432 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"84a645b31233e6f6691e7af3a8d18c33f1db7629388f3007d7e51e43f9f65e97"} Jan 29 16:38:30 crc kubenswrapper[4886]: I0129 16:38:30.088461 4886 scope.go:117] "RemoveContainer" containerID="773fe28c1c2f4b4e6b5a35ea611b7d8ab8f392d8f1b68bb09ec93e5c483b53ed" Jan 29 16:38:38 crc kubenswrapper[4886]: E0129 16:38:38.624215 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:38:42 crc kubenswrapper[4886]: I0129 16:38:42.204359 4886 generic.go:334] "Generic (PLEG): container finished" podID="69003a39-1c09-4087-a494-ebfd69e973cf" containerID="9bd48ab4996ca74fa989778e83dba86fbb2f2ad2104534befcf501673ddd232f" exitCode=0 Jan 29 16:38:42 crc kubenswrapper[4886]: I0129 16:38:42.204911 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfv6k" event={"ID":"69003a39-1c09-4087-a494-ebfd69e973cf","Type":"ContainerDied","Data":"9bd48ab4996ca74fa989778e83dba86fbb2f2ad2104534befcf501673ddd232f"} Jan 29 16:38:43 crc kubenswrapper[4886]: I0129 16:38:43.215757 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfv6k" 
event={"ID":"69003a39-1c09-4087-a494-ebfd69e973cf","Type":"ContainerStarted","Data":"735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef"} Jan 29 16:38:43 crc kubenswrapper[4886]: I0129 16:38:43.240050 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jfv6k" podStartSLOduration=2.266022337 podStartE2EDuration="10m39.240024748s" podCreationTimestamp="2026-01-29 16:28:04 +0000 UTC" firstStartedPulling="2026-01-29 16:28:05.688495947 +0000 UTC m=+368.597215219" lastFinishedPulling="2026-01-29 16:38:42.662498318 +0000 UTC m=+1005.571217630" observedRunningTime="2026-01-29 16:38:43.235379803 +0000 UTC m=+1006.144099085" watchObservedRunningTime="2026-01-29 16:38:43.240024748 +0000 UTC m=+1006.148744030" Jan 29 16:38:44 crc kubenswrapper[4886]: I0129 16:38:44.648711 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:38:44 crc kubenswrapper[4886]: I0129 16:38:44.648796 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:38:44 crc kubenswrapper[4886]: I0129 16:38:44.708790 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:38:49 crc kubenswrapper[4886]: E0129 16:38:49.621556 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" Jan 29 16:38:54 crc kubenswrapper[4886]: I0129 16:38:54.719254 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:39:02 crc kubenswrapper[4886]: I0129 16:39:02.351803 4886 generic.go:334] "Generic (PLEG): container finished" podID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerID="0fa864e4732d0bb9a1a68d7843a62bc56027d9ccdfea2ad23148f5d87b7ecd0c" exitCode=0 Jan 29 16:39:02 crc kubenswrapper[4886]: I0129 16:39:02.351858 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkk68" event={"ID":"d84ce3e9-c41a-4a08-8d86-2a918d5e9450","Type":"ContainerDied","Data":"0fa864e4732d0bb9a1a68d7843a62bc56027d9ccdfea2ad23148f5d87b7ecd0c"} Jan 29 16:39:03 crc kubenswrapper[4886]: I0129 16:39:03.362425 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkk68" event={"ID":"d84ce3e9-c41a-4a08-8d86-2a918d5e9450","Type":"ContainerStarted","Data":"29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990"} Jan 29 16:39:03 crc kubenswrapper[4886]: I0129 16:39:03.383754 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zkk68" podStartSLOduration=2.377020432 podStartE2EDuration="10m56.383732099s" podCreationTimestamp="2026-01-29 16:28:07 +0000 UTC" firstStartedPulling="2026-01-29 16:28:08.721120306 +0000 UTC m=+371.629839588" lastFinishedPulling="2026-01-29 16:39:02.727831983 +0000 UTC m=+1025.636551255" observedRunningTime="2026-01-29 16:39:03.382476232 +0000 UTC m=+1026.291195544" watchObservedRunningTime="2026-01-29 16:39:03.383732099 +0000 UTC m=+1026.292451371" Jan 29 16:39:07 crc kubenswrapper[4886]: I0129 16:39:07.583296 4886 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:39:07 crc kubenswrapper[4886]: I0129 16:39:07.583762 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:39:08 crc kubenswrapper[4886]: I0129 16:39:08.635834 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerName="registry-server" probeResult="failure" output=< Jan 29 16:39:08 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 16:39:08 crc kubenswrapper[4886]: > Jan 29 16:39:17 crc kubenswrapper[4886]: I0129 16:39:17.653404 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:39:17 crc kubenswrapper[4886]: I0129 16:39:17.777872 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:40:59 crc kubenswrapper[4886]: I0129 16:40:59.661245 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:40:59 crc kubenswrapper[4886]: I0129 16:40:59.662519 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:41:29 crc kubenswrapper[4886]: I0129 16:41:29.661212 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:41:29 crc kubenswrapper[4886]: I0129 16:41:29.662023 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:41:59 crc kubenswrapper[4886]: I0129 16:41:59.661591 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:41:59 crc kubenswrapper[4886]: I0129 16:41:59.662171 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:41:59 crc kubenswrapper[4886]: I0129 16:41:59.662266 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 16:41:59 crc 
kubenswrapper[4886]: I0129 16:41:59.663023 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84a645b31233e6f6691e7af3a8d18c33f1db7629388f3007d7e51e43f9f65e97"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 16:41:59 crc kubenswrapper[4886]: I0129 16:41:59.663114 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://84a645b31233e6f6691e7af3a8d18c33f1db7629388f3007d7e51e43f9f65e97" gracePeriod=600 Jan 29 16:42:00 crc kubenswrapper[4886]: I0129 16:42:00.619604 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="84a645b31233e6f6691e7af3a8d18c33f1db7629388f3007d7e51e43f9f65e97" exitCode=0 Jan 29 16:42:00 crc kubenswrapper[4886]: I0129 16:42:00.626047 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"84a645b31233e6f6691e7af3a8d18c33f1db7629388f3007d7e51e43f9f65e97"} Jan 29 16:42:00 crc kubenswrapper[4886]: I0129 16:42:00.626121 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"e07342110c4b02787cb4723c63fa377397be4b574d1be34193ab1f7b4cebac54"} Jan 29 16:42:00 crc kubenswrapper[4886]: I0129 16:42:00.626152 4886 scope.go:117] "RemoveContainer" containerID="50ba5c9bdbddc145f7d20c044a7cd326eb16e00aa141bfc3e8c4f610ef31ae97" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.010147 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz"] Jan 29 16:42:10 crc kubenswrapper[4886]: E0129 16:42:10.011074 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc856aa-a27b-4856-a888-7104df47cf30" containerName="registry-server" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.011090 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc856aa-a27b-4856-a888-7104df47cf30" containerName="registry-server" Jan 29 16:42:10 crc kubenswrapper[4886]: E0129 16:42:10.011103 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc856aa-a27b-4856-a888-7104df47cf30" containerName="extract-utilities" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.011112 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc856aa-a27b-4856-a888-7104df47cf30" containerName="extract-utilities" Jan 29 16:42:10 crc kubenswrapper[4886]: E0129 16:42:10.011132 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92af1116-2260-4c2f-a3b2-b3045d51065e" containerName="extract-utilities" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.011140 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="92af1116-2260-4c2f-a3b2-b3045d51065e" containerName="extract-utilities" Jan 29 16:42:10 crc kubenswrapper[4886]: E0129 16:42:10.011153 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92af1116-2260-4c2f-a3b2-b3045d51065e" containerName="registry-server" Jan 29 16:42:10 crc 
kubenswrapper[4886]: I0129 16:42:10.011160 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="92af1116-2260-4c2f-a3b2-b3045d51065e" containerName="registry-server" Jan 29 16:42:10 crc kubenswrapper[4886]: E0129 16:42:10.011172 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92af1116-2260-4c2f-a3b2-b3045d51065e" containerName="extract-content" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.011180 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="92af1116-2260-4c2f-a3b2-b3045d51065e" containerName="extract-content" Jan 29 16:42:10 crc kubenswrapper[4886]: E0129 16:42:10.011200 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc856aa-a27b-4856-a888-7104df47cf30" containerName="extract-content" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.011207 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc856aa-a27b-4856-a888-7104df47cf30" containerName="extract-content" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.011372 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc856aa-a27b-4856-a888-7104df47cf30" containerName="registry-server" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.011394 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="92af1116-2260-4c2f-a3b2-b3045d51065e" containerName="registry-server" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.012396 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.015103 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.023128 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz"] Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.160553 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.160656 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.160886 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxzmn\" (UniqueName: \"kubernetes.io/projected/20a67e3b-3393-4dea-81c8-42c2e22ad315-kube-api-access-lxzmn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.262130 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.262240 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxzmn\" (UniqueName: \"kubernetes.io/projected/20a67e3b-3393-4dea-81c8-42c2e22ad315-kube-api-access-lxzmn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.262463 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.262657 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.263185 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.289697 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxzmn\" (UniqueName: \"kubernetes.io/projected/20a67e3b-3393-4dea-81c8-42c2e22ad315-kube-api-access-lxzmn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.339517 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:10 crc kubenswrapper[4886]: I0129 16:42:10.778923 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz"] Jan 29 16:42:11 crc kubenswrapper[4886]: I0129 16:42:11.702505 4886 generic.go:334] "Generic (PLEG): container finished" podID="20a67e3b-3393-4dea-81c8-42c2e22ad315" containerID="5d883c5a30d8f4bbb039e6aaa651b8e09e6b2a8064244a25c33a761d3d8863ae" exitCode=0 Jan 29 16:42:11 crc kubenswrapper[4886]: I0129 16:42:11.702563 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" event={"ID":"20a67e3b-3393-4dea-81c8-42c2e22ad315","Type":"ContainerDied","Data":"5d883c5a30d8f4bbb039e6aaa651b8e09e6b2a8064244a25c33a761d3d8863ae"} Jan 29 16:42:11 crc kubenswrapper[4886]: I0129 16:42:11.702928 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" event={"ID":"20a67e3b-3393-4dea-81c8-42c2e22ad315","Type":"ContainerStarted","Data":"976f1abd45dc9a03c85afaf2f393d899a8fe7d61004333b35e039ff0d753b2d4"} Jan 29 16:42:11 crc kubenswrapper[4886]: I0129 16:42:11.706425 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 16:42:13 crc kubenswrapper[4886]: I0129 16:42:13.726605 4886 generic.go:334] "Generic (PLEG): container finished" podID="20a67e3b-3393-4dea-81c8-42c2e22ad315" containerID="33b121937df6965f1e7c4b97eec963e1caa986d708bab7e6baf54e700c6b9a38" exitCode=0 Jan 29 16:42:13 crc kubenswrapper[4886]: I0129 16:42:13.726678 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" event={"ID":"20a67e3b-3393-4dea-81c8-42c2e22ad315","Type":"ContainerDied","Data":"33b121937df6965f1e7c4b97eec963e1caa986d708bab7e6baf54e700c6b9a38"} Jan 29 16:42:14 crc kubenswrapper[4886]: I0129 16:42:14.736230 4886 generic.go:334] "Generic (PLEG): container finished" podID="20a67e3b-3393-4dea-81c8-42c2e22ad315" containerID="f97710e37d132101bc18cdd88c6b7f51c7d65099d23a9fcf1887c1bba9f84a3e" exitCode=0 Jan 29 16:42:14 crc kubenswrapper[4886]: I0129 16:42:14.736359 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" event={"ID":"20a67e3b-3393-4dea-81c8-42c2e22ad315","Type":"ContainerDied","Data":"f97710e37d132101bc18cdd88c6b7f51c7d65099d23a9fcf1887c1bba9f84a3e"} Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.015190 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.156437 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-bundle\") pod \"20a67e3b-3393-4dea-81c8-42c2e22ad315\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.156486 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxzmn\" (UniqueName: \"kubernetes.io/projected/20a67e3b-3393-4dea-81c8-42c2e22ad315-kube-api-access-lxzmn\") pod \"20a67e3b-3393-4dea-81c8-42c2e22ad315\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.156518 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-util\") pod \"20a67e3b-3393-4dea-81c8-42c2e22ad315\" (UID: \"20a67e3b-3393-4dea-81c8-42c2e22ad315\") " Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.162313 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-bundle" (OuterVolumeSpecName: "bundle") pod "20a67e3b-3393-4dea-81c8-42c2e22ad315" (UID: "20a67e3b-3393-4dea-81c8-42c2e22ad315"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.169219 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a67e3b-3393-4dea-81c8-42c2e22ad315-kube-api-access-lxzmn" (OuterVolumeSpecName: "kube-api-access-lxzmn") pod "20a67e3b-3393-4dea-81c8-42c2e22ad315" (UID: "20a67e3b-3393-4dea-81c8-42c2e22ad315"). InnerVolumeSpecName "kube-api-access-lxzmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.187029 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-util" (OuterVolumeSpecName: "util") pod "20a67e3b-3393-4dea-81c8-42c2e22ad315" (UID: "20a67e3b-3393-4dea-81c8-42c2e22ad315"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.258293 4886 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.258353 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxzmn\" (UniqueName: \"kubernetes.io/projected/20a67e3b-3393-4dea-81c8-42c2e22ad315-kube-api-access-lxzmn\") on node \"crc\" DevicePath \"\"" Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.258366 4886 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20a67e3b-3393-4dea-81c8-42c2e22ad315-util\") on node \"crc\" DevicePath \"\"" Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.751924 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" event={"ID":"20a67e3b-3393-4dea-81c8-42c2e22ad315","Type":"ContainerDied","Data":"976f1abd45dc9a03c85afaf2f393d899a8fe7d61004333b35e039ff0d753b2d4"} Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.751981 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="976f1abd45dc9a03c85afaf2f393d899a8fe7d61004333b35e039ff0d753b2d4" Jan 29 16:42:16 crc kubenswrapper[4886]: I0129 16:42:16.751989 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.921683 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-72k5z"] Jan 29 16:42:28 crc kubenswrapper[4886]: E0129 16:42:28.922504 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a67e3b-3393-4dea-81c8-42c2e22ad315" containerName="util" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.922522 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a67e3b-3393-4dea-81c8-42c2e22ad315" containerName="util" Jan 29 16:42:28 crc kubenswrapper[4886]: E0129 16:42:28.922540 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a67e3b-3393-4dea-81c8-42c2e22ad315" containerName="pull" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.922549 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a67e3b-3393-4dea-81c8-42c2e22ad315" containerName="pull" Jan 29 16:42:28 crc kubenswrapper[4886]: E0129 16:42:28.922569 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a67e3b-3393-4dea-81c8-42c2e22ad315" containerName="extract" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.922578 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a67e3b-3393-4dea-81c8-42c2e22ad315" containerName="extract" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.922702 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a67e3b-3393-4dea-81c8-42c2e22ad315" containerName="extract" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.923206 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-72k5z" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.934264 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.934914 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.934981 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-87x2p" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.940887 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-72k5z"] Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.979123 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5"] Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.990152 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9"] Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.990840 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.991151 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5" Jan 29 16:42:28 crc kubenswrapper[4886]: I0129 16:42:28.995152 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.000428 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5"] Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.001142 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-kqkdx" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.006045 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9"] Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.036039 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxpmf\" (UniqueName: \"kubernetes.io/projected/1151b336-be43-4e43-959d-463c956e9bc4-kube-api-access-pxpmf\") pod \"obo-prometheus-operator-68bc856cb9-72k5z\" (UID: \"1151b336-be43-4e43-959d-463c956e9bc4\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-72k5z" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.137457 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2e7310d-6390-4a0d-b0bd-f8467c80517c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9\" (UID: \"e2e7310d-6390-4a0d-b0bd-f8467c80517c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.137517 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1472730-ce1e-4333-a6c6-930196b9d257-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5\" (UID: \"e1472730-ce1e-4333-a6c6-930196b9d257\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.137574 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxpmf\" (UniqueName: \"kubernetes.io/projected/1151b336-be43-4e43-959d-463c956e9bc4-kube-api-access-pxpmf\") pod \"obo-prometheus-operator-68bc856cb9-72k5z\" (UID: \"1151b336-be43-4e43-959d-463c956e9bc4\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-72k5z" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.137589 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1472730-ce1e-4333-a6c6-930196b9d257-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5\" (UID: \"e1472730-ce1e-4333-a6c6-930196b9d257\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.137611 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e2e7310d-6390-4a0d-b0bd-f8467c80517c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9\" (UID: \"e2e7310d-6390-4a0d-b0bd-f8467c80517c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.170472 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxpmf\" (UniqueName: \"kubernetes.io/projected/1151b336-be43-4e43-959d-463c956e9bc4-kube-api-access-pxpmf\") pod \"obo-prometheus-operator-68bc856cb9-72k5z\" (UID: \"1151b336-be43-4e43-959d-463c956e9bc4\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-72k5z" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.173707 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-w5qml"] Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.174493 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-w5qml" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.176852 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.179849 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-qx7cn" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.211532 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-w5qml"] Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.238704 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1472730-ce1e-4333-a6c6-930196b9d257-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5\" (UID: \"e1472730-ce1e-4333-a6c6-930196b9d257\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.238751 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e2e7310d-6390-4a0d-b0bd-f8467c80517c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9\" (UID: \"e2e7310d-6390-4a0d-b0bd-f8467c80517c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.238802 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2e7310d-6390-4a0d-b0bd-f8467c80517c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9\" (UID: \"e2e7310d-6390-4a0d-b0bd-f8467c80517c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.238831 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1472730-ce1e-4333-a6c6-930196b9d257-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5\" (UID: \"e1472730-ce1e-4333-a6c6-930196b9d257\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.241585 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-72k5z" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.242792 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1472730-ce1e-4333-a6c6-930196b9d257-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5\" (UID: \"e1472730-ce1e-4333-a6c6-930196b9d257\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.242944 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1472730-ce1e-4333-a6c6-930196b9d257-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5\" (UID: \"e1472730-ce1e-4333-a6c6-930196b9d257\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.243932 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e2e7310d-6390-4a0d-b0bd-f8467c80517c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9\" (UID: \"e2e7310d-6390-4a0d-b0bd-f8467c80517c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.259783 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2e7310d-6390-4a0d-b0bd-f8467c80517c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9\" (UID: \"e2e7310d-6390-4a0d-b0bd-f8467c80517c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.312767 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.324465 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.343500 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59nxm\" (UniqueName: \"kubernetes.io/projected/17549a68-0567-40f8-9dda-37cd61f71b94-kube-api-access-59nxm\") pod \"observability-operator-59bdc8b94-w5qml\" (UID: \"17549a68-0567-40f8-9dda-37cd61f71b94\") " pod="openshift-operators/observability-operator-59bdc8b94-w5qml" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.343672 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/17549a68-0567-40f8-9dda-37cd61f71b94-observability-operator-tls\") pod \"observability-operator-59bdc8b94-w5qml\" (UID: \"17549a68-0567-40f8-9dda-37cd61f71b94\") " pod="openshift-operators/observability-operator-59bdc8b94-w5qml" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.352137 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dtcpm"] Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.352976 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.366394 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-pmhdg" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.375225 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dtcpm"] Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.445380 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/17549a68-0567-40f8-9dda-37cd61f71b94-observability-operator-tls\") pod \"observability-operator-59bdc8b94-w5qml\" (UID: \"17549a68-0567-40f8-9dda-37cd61f71b94\") " pod="openshift-operators/observability-operator-59bdc8b94-w5qml" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.445445 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59nxm\" (UniqueName: \"kubernetes.io/projected/17549a68-0567-40f8-9dda-37cd61f71b94-kube-api-access-59nxm\") pod \"observability-operator-59bdc8b94-w5qml\" (UID: \"17549a68-0567-40f8-9dda-37cd61f71b94\") " pod="openshift-operators/observability-operator-59bdc8b94-w5qml" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.453755 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/17549a68-0567-40f8-9dda-37cd61f71b94-observability-operator-tls\") pod \"observability-operator-59bdc8b94-w5qml\" (UID: \"17549a68-0567-40f8-9dda-37cd61f71b94\") " pod="openshift-operators/observability-operator-59bdc8b94-w5qml" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.467610 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59nxm\" (UniqueName: \"kubernetes.io/projected/17549a68-0567-40f8-9dda-37cd61f71b94-kube-api-access-59nxm\") pod \"observability-operator-59bdc8b94-w5qml\" (UID: \"17549a68-0567-40f8-9dda-37cd61f71b94\") " pod="openshift-operators/observability-operator-59bdc8b94-w5qml" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.504833 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-w5qml" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.547743 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q58sl\" (UniqueName: \"kubernetes.io/projected/d2a26d31-689d-4052-9df2-1654feb68c2d-kube-api-access-q58sl\") pod \"perses-operator-5bf474d74f-dtcpm\" (UID: \"d2a26d31-689d-4052-9df2-1654feb68c2d\") " pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.547864 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2a26d31-689d-4052-9df2-1654feb68c2d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dtcpm\" (UID: \"d2a26d31-689d-4052-9df2-1654feb68c2d\") " pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.599406 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-72k5z"] Jan 29 16:42:29 crc kubenswrapper[4886]: W0129 16:42:29.622654 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1151b336_be43_4e43_959d_463c956e9bc4.slice/crio-d59ae2b4bf608b7fa1b68d986522ef7dbaaad1a9d834a7636f0a9fc4f8df6c56 WatchSource:0}: Error finding container d59ae2b4bf608b7fa1b68d986522ef7dbaaad1a9d834a7636f0a9fc4f8df6c56: Status 404 returned error can't find the container with id d59ae2b4bf608b7fa1b68d986522ef7dbaaad1a9d834a7636f0a9fc4f8df6c56 Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.649719 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q58sl\" (UniqueName: \"kubernetes.io/projected/d2a26d31-689d-4052-9df2-1654feb68c2d-kube-api-access-q58sl\") pod \"perses-operator-5bf474d74f-dtcpm\" (UID: \"d2a26d31-689d-4052-9df2-1654feb68c2d\") " pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.649821 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2a26d31-689d-4052-9df2-1654feb68c2d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dtcpm\" (UID: \"d2a26d31-689d-4052-9df2-1654feb68c2d\") " pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.651137 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2a26d31-689d-4052-9df2-1654feb68c2d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dtcpm\" (UID: \"d2a26d31-689d-4052-9df2-1654feb68c2d\") " pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.660086 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9"] Jan 29 16:42:29 crc kubenswrapper[4886]: W0129 16:42:29.665653 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2e7310d_6390_4a0d_b0bd_f8467c80517c.slice/crio-c3a2aed72fa7cf38cac1b034388741047a39f6194ebba489f12e0f20f05d7e1a WatchSource:0}: Error finding container 
c3a2aed72fa7cf38cac1b034388741047a39f6194ebba489f12e0f20f05d7e1a: Status 404 returned error can't find the container with id c3a2aed72fa7cf38cac1b034388741047a39f6194ebba489f12e0f20f05d7e1a Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.676065 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q58sl\" (UniqueName: \"kubernetes.io/projected/d2a26d31-689d-4052-9df2-1654feb68c2d-kube-api-access-q58sl\") pod \"perses-operator-5bf474d74f-dtcpm\" (UID: \"d2a26d31-689d-4052-9df2-1654feb68c2d\") " pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.679067 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.701478 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5"] Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.826570 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9" event={"ID":"e2e7310d-6390-4a0d-b0bd-f8467c80517c","Type":"ContainerStarted","Data":"c3a2aed72fa7cf38cac1b034388741047a39f6194ebba489f12e0f20f05d7e1a"} Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.827314 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5" event={"ID":"e1472730-ce1e-4333-a6c6-930196b9d257","Type":"ContainerStarted","Data":"c8873af3ed6924f4ee99c1c3a5b3b1fe51732f9684c5fb5f30fd703ee439948d"} Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.828227 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-72k5z" event={"ID":"1151b336-be43-4e43-959d-463c956e9bc4","Type":"ContainerStarted","Data":"d59ae2b4bf608b7fa1b68d986522ef7dbaaad1a9d834a7636f0a9fc4f8df6c56"} Jan 29 16:42:29 crc kubenswrapper[4886]: I0129 16:42:29.938674 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dtcpm"] Jan 29 16:42:29 crc kubenswrapper[4886]: W0129 16:42:29.941757 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2a26d31_689d_4052_9df2_1654feb68c2d.slice/crio-28badb714870bb63b729974e5f3d38902243caabbf53e05f6a009feeb7a0b316 WatchSource:0}: Error finding container 28badb714870bb63b729974e5f3d38902243caabbf53e05f6a009feeb7a0b316: Status 404 returned error can't find the container with id 28badb714870bb63b729974e5f3d38902243caabbf53e05f6a009feeb7a0b316 Jan 29 16:42:30 crc kubenswrapper[4886]: I0129 16:42:30.006304 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-w5qml"] Jan 29 16:42:30 crc kubenswrapper[4886]: I0129 16:42:30.834856 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-w5qml" event={"ID":"17549a68-0567-40f8-9dda-37cd61f71b94","Type":"ContainerStarted","Data":"f28a277f08071599754e25d38da40d646af2c0915c4cc3ecfb76416f18ac3e77"} Jan 29 16:42:30 crc kubenswrapper[4886]: I0129 16:42:30.836231 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" 
event={"ID":"d2a26d31-689d-4052-9df2-1654feb68c2d","Type":"ContainerStarted","Data":"28badb714870bb63b729974e5f3d38902243caabbf53e05f6a009feeb7a0b316"} Jan 29 16:42:44 crc kubenswrapper[4886]: I0129 16:42:44.923175 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-w5qml" event={"ID":"17549a68-0567-40f8-9dda-37cd61f71b94","Type":"ContainerStarted","Data":"5f251584cd4a72392bf82fcbaac03e86f9d34fedf3e60b93b7d5cf1e7fb50a29"} Jan 29 16:42:44 crc kubenswrapper[4886]: I0129 16:42:44.923833 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-w5qml" Jan 29 16:42:44 crc kubenswrapper[4886]: I0129 16:42:44.925459 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-w5qml" Jan 29 16:42:44 crc kubenswrapper[4886]: I0129 16:42:44.925486 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" event={"ID":"d2a26d31-689d-4052-9df2-1654feb68c2d","Type":"ContainerStarted","Data":"9992d8e0634ed981ff9fd7bc0427ba554332b02075e523f4c92a45ceda3b6d32"} Jan 29 16:42:44 crc kubenswrapper[4886]: I0129 16:42:44.925631 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" Jan 29 16:42:44 crc kubenswrapper[4886]: I0129 16:42:44.926783 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-72k5z" event={"ID":"1151b336-be43-4e43-959d-463c956e9bc4","Type":"ContainerStarted","Data":"a9babc6a5fe0ba78e4b2020f1e2034d16a2615aa4af5bb2a69984dd3ca27c70b"} Jan 29 16:42:44 crc kubenswrapper[4886]: I0129 16:42:44.928148 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9" event={"ID":"e2e7310d-6390-4a0d-b0bd-f8467c80517c","Type":"ContainerStarted","Data":"807ec1adae81c9e16b2e9afbb7d38b63e30bf1a658ddb7cef971234e3f58eeaa"} Jan 29 16:42:44 crc kubenswrapper[4886]: I0129 16:42:44.929729 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5" event={"ID":"e1472730-ce1e-4333-a6c6-930196b9d257","Type":"ContainerStarted","Data":"9de3db81fb223377845988b8fe8a70e61eee46c753b8d2742e232ec42c7c4d5c"} Jan 29 16:42:44 crc kubenswrapper[4886]: I0129 16:42:44.949517 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-w5qml" podStartSLOduration=1.778014539 podStartE2EDuration="15.949504031s" podCreationTimestamp="2026-01-29 16:42:29 +0000 UTC" firstStartedPulling="2026-01-29 16:42:30.007827503 +0000 UTC m=+1232.916546775" lastFinishedPulling="2026-01-29 16:42:44.179316995 +0000 UTC m=+1247.088036267" observedRunningTime="2026-01-29 16:42:44.947175275 +0000 UTC m=+1247.855894557" watchObservedRunningTime="2026-01-29 16:42:44.949504031 +0000 UTC m=+1247.858223303" Jan 29 16:42:44 crc kubenswrapper[4886]: I0129 16:42:44.978806 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9" podStartSLOduration=2.579788996 podStartE2EDuration="16.9787837s" podCreationTimestamp="2026-01-29 16:42:28 +0000 UTC" firstStartedPulling="2026-01-29 16:42:29.679085645 +0000 UTC m=+1232.587804927" lastFinishedPulling="2026-01-29 
16:42:44.078080359 +0000 UTC m=+1246.986799631" observedRunningTime="2026-01-29 16:42:44.973764127 +0000 UTC m=+1247.882483409" watchObservedRunningTime="2026-01-29 16:42:44.9787837 +0000 UTC m=+1247.887502972" Jan 29 16:42:45 crc kubenswrapper[4886]: I0129 16:42:45.010748 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5" podStartSLOduration=2.673875029 podStartE2EDuration="17.010734304s" podCreationTimestamp="2026-01-29 16:42:28 +0000 UTC" firstStartedPulling="2026-01-29 16:42:29.739201867 +0000 UTC m=+1232.647921139" lastFinishedPulling="2026-01-29 16:42:44.076061142 +0000 UTC m=+1246.984780414" observedRunningTime="2026-01-29 16:42:45.009444258 +0000 UTC m=+1247.918163540" watchObservedRunningTime="2026-01-29 16:42:45.010734304 +0000 UTC m=+1247.919453576" Jan 29 16:42:45 crc kubenswrapper[4886]: I0129 16:42:45.036135 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" podStartSLOduration=1.909028948 podStartE2EDuration="16.036115493s" podCreationTimestamp="2026-01-29 16:42:29 +0000 UTC" firstStartedPulling="2026-01-29 16:42:29.943985665 +0000 UTC m=+1232.852704927" lastFinishedPulling="2026-01-29 16:42:44.0710722 +0000 UTC m=+1246.979791472" observedRunningTime="2026-01-29 16:42:45.034874408 +0000 UTC m=+1247.943593680" watchObservedRunningTime="2026-01-29 16:42:45.036115493 +0000 UTC m=+1247.944834765" Jan 29 16:42:45 crc kubenswrapper[4886]: I0129 16:42:45.050019 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-72k5z" podStartSLOduration=2.602313425 podStartE2EDuration="17.049994886s" podCreationTimestamp="2026-01-29 16:42:28 +0000 UTC" firstStartedPulling="2026-01-29 16:42:29.629085 +0000 UTC m=+1232.537804272" lastFinishedPulling="2026-01-29 16:42:44.076766471 +0000 UTC m=+1246.985485733" observedRunningTime="2026-01-29 16:42:45.047133755 +0000 UTC m=+1247.955853027" watchObservedRunningTime="2026-01-29 16:42:45.049994886 +0000 UTC m=+1247.958714158" Jan 29 16:42:49 crc kubenswrapper[4886]: I0129 16:42:49.681703 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-dtcpm" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.672652 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-bqffj"] Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.673793 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-bqffj" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.676081 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.676242 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.676313 4886 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jlgkl" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.697747 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x5t7\" (UniqueName: \"kubernetes.io/projected/f883321e-6f99-4c0d-89ea-377fec9d166c-kube-api-access-9x5t7\") pod \"cert-manager-cainjector-cf98fcc89-bqffj\" (UID: \"f883321e-6f99-4c0d-89ea-377fec9d166c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-bqffj" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.706383 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-bqffj"] Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.733527 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-n8tt2"] Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.734377 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-n8tt2" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.736272 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-sd87l"] Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.737352 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-sd87l" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.737609 4886 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-fl6zk" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.743223 4886 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-sqmqv" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.743385 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-n8tt2"] Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.746377 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-sd87l"] Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.799029 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpdsz\" (UniqueName: \"kubernetes.io/projected/a80a9fce-17df-45c6-b123-f3060469c1c9-kube-api-access-mpdsz\") pod \"cert-manager-webhook-687f57d79b-sd87l\" (UID: \"a80a9fce-17df-45c6-b123-f3060469c1c9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-sd87l" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.799113 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x5t7\" (UniqueName: \"kubernetes.io/projected/f883321e-6f99-4c0d-89ea-377fec9d166c-kube-api-access-9x5t7\") pod \"cert-manager-cainjector-cf98fcc89-bqffj\" (UID: \"f883321e-6f99-4c0d-89ea-377fec9d166c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-bqffj" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.799142 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcbkh\" (UniqueName: \"kubernetes.io/projected/0eee9f11-c5ff-490b-a5ea-7a62ef8f0a0a-kube-api-access-lcbkh\") pod \"cert-manager-858654f9db-n8tt2\" (UID: \"0eee9f11-c5ff-490b-a5ea-7a62ef8f0a0a\") " pod="cert-manager/cert-manager-858654f9db-n8tt2" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.832370 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x5t7\" (UniqueName: \"kubernetes.io/projected/f883321e-6f99-4c0d-89ea-377fec9d166c-kube-api-access-9x5t7\") pod \"cert-manager-cainjector-cf98fcc89-bqffj\" (UID: \"f883321e-6f99-4c0d-89ea-377fec9d166c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-bqffj" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.900880 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpdsz\" (UniqueName: \"kubernetes.io/projected/a80a9fce-17df-45c6-b123-f3060469c1c9-kube-api-access-mpdsz\") pod \"cert-manager-webhook-687f57d79b-sd87l\" (UID: \"a80a9fce-17df-45c6-b123-f3060469c1c9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-sd87l" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.900995 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcbkh\" (UniqueName: \"kubernetes.io/projected/0eee9f11-c5ff-490b-a5ea-7a62ef8f0a0a-kube-api-access-lcbkh\") pod \"cert-manager-858654f9db-n8tt2\" (UID: \"0eee9f11-c5ff-490b-a5ea-7a62ef8f0a0a\") " pod="cert-manager/cert-manager-858654f9db-n8tt2" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.927062 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcbkh\" (UniqueName: 
\"kubernetes.io/projected/0eee9f11-c5ff-490b-a5ea-7a62ef8f0a0a-kube-api-access-lcbkh\") pod \"cert-manager-858654f9db-n8tt2\" (UID: \"0eee9f11-c5ff-490b-a5ea-7a62ef8f0a0a\") " pod="cert-manager/cert-manager-858654f9db-n8tt2" Jan 29 16:42:53 crc kubenswrapper[4886]: I0129 16:42:53.929695 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpdsz\" (UniqueName: \"kubernetes.io/projected/a80a9fce-17df-45c6-b123-f3060469c1c9-kube-api-access-mpdsz\") pod \"cert-manager-webhook-687f57d79b-sd87l\" (UID: \"a80a9fce-17df-45c6-b123-f3060469c1c9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-sd87l" Jan 29 16:42:54 crc kubenswrapper[4886]: I0129 16:42:54.009301 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-bqffj" Jan 29 16:42:54 crc kubenswrapper[4886]: I0129 16:42:54.098265 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-n8tt2" Jan 29 16:42:54 crc kubenswrapper[4886]: I0129 16:42:54.104154 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-sd87l" Jan 29 16:42:54 crc kubenswrapper[4886]: I0129 16:42:54.599179 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-bqffj"] Jan 29 16:42:54 crc kubenswrapper[4886]: I0129 16:42:54.633690 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-n8tt2"] Jan 29 16:42:54 crc kubenswrapper[4886]: I0129 16:42:54.752372 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-sd87l"] Jan 29 16:42:54 crc kubenswrapper[4886]: I0129 16:42:54.986290 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-sd87l" event={"ID":"a80a9fce-17df-45c6-b123-f3060469c1c9","Type":"ContainerStarted","Data":"d76cc09d39fd1489a0a6731b4db02244e2b953b627c2a1da89d75a187a77d4fa"} Jan 29 16:42:54 crc kubenswrapper[4886]: I0129 16:42:54.987221 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-n8tt2" event={"ID":"0eee9f11-c5ff-490b-a5ea-7a62ef8f0a0a","Type":"ContainerStarted","Data":"6e6db538cd0773e16d22299a597c45cac1c79850c2689aceea23c8c1d44a2acb"} Jan 29 16:42:54 crc kubenswrapper[4886]: I0129 16:42:54.988119 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-bqffj" event={"ID":"f883321e-6f99-4c0d-89ea-377fec9d166c","Type":"ContainerStarted","Data":"aac746d82eefc1fab729f2d22b3db755db7701292ab01dff94bf3a35158d7548"} Jan 29 16:43:00 crc kubenswrapper[4886]: I0129 16:43:00.027297 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-bqffj" event={"ID":"f883321e-6f99-4c0d-89ea-377fec9d166c","Type":"ContainerStarted","Data":"a4f3b16bd260748325fa52011e2e544b805ef52770eb12e956d54a4637e53c9c"} Jan 29 16:43:00 crc kubenswrapper[4886]: I0129 16:43:00.028651 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-sd87l" event={"ID":"a80a9fce-17df-45c6-b123-f3060469c1c9","Type":"ContainerStarted","Data":"4ae4f5de49ab8e404f36f0965082d3290ed77dedd5ab75141d3d59441c428d17"} Jan 29 16:43:00 crc kubenswrapper[4886]: I0129 16:43:00.028852 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="cert-manager/cert-manager-webhook-687f57d79b-sd87l" Jan 29 16:43:00 crc kubenswrapper[4886]: I0129 16:43:00.029792 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-n8tt2" event={"ID":"0eee9f11-c5ff-490b-a5ea-7a62ef8f0a0a","Type":"ContainerStarted","Data":"aac75e2cb7cfa8356fdcb6a853568d59ab7efaf19a045c2f0d1b28d5aeac4a61"} Jan 29 16:43:00 crc kubenswrapper[4886]: I0129 16:43:00.039938 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-bqffj" podStartSLOduration=2.819381327 podStartE2EDuration="7.039916779s" podCreationTimestamp="2026-01-29 16:42:53 +0000 UTC" firstStartedPulling="2026-01-29 16:42:54.610755139 +0000 UTC m=+1257.519474401" lastFinishedPulling="2026-01-29 16:42:58.831290581 +0000 UTC m=+1261.740009853" observedRunningTime="2026-01-29 16:43:00.039166627 +0000 UTC m=+1262.947885899" watchObservedRunningTime="2026-01-29 16:43:00.039916779 +0000 UTC m=+1262.948636081" Jan 29 16:43:00 crc kubenswrapper[4886]: I0129 16:43:00.055791 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-n8tt2" podStartSLOduration=2.8109913300000002 podStartE2EDuration="7.055769608s" podCreationTimestamp="2026-01-29 16:42:53 +0000 UTC" firstStartedPulling="2026-01-29 16:42:54.652903182 +0000 UTC m=+1257.561622454" lastFinishedPulling="2026-01-29 16:42:58.89768144 +0000 UTC m=+1261.806400732" observedRunningTime="2026-01-29 16:43:00.051466756 +0000 UTC m=+1262.960186088" watchObservedRunningTime="2026-01-29 16:43:00.055769608 +0000 UTC m=+1262.964488880" Jan 29 16:43:00 crc kubenswrapper[4886]: I0129 16:43:00.087949 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-sd87l" podStartSLOduration=2.93782224 podStartE2EDuration="7.087928198s" podCreationTimestamp="2026-01-29 16:42:53 +0000 UTC" firstStartedPulling="2026-01-29 16:42:54.75595134 +0000 UTC m=+1257.664670612" lastFinishedPulling="2026-01-29 16:42:58.906057288 +0000 UTC m=+1261.814776570" observedRunningTime="2026-01-29 16:43:00.087215388 +0000 UTC m=+1262.995934670" watchObservedRunningTime="2026-01-29 16:43:00.087928198 +0000 UTC m=+1262.996647470" Jan 29 16:43:04 crc kubenswrapper[4886]: I0129 16:43:04.108355 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-sd87l" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.499224 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t"] Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.504480 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.506073 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t"] Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.516143 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.600110 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.600206 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.600237 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntc9w\" (UniqueName: \"kubernetes.io/projected/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-kube-api-access-ntc9w\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.701938 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.702115 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.702151 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntc9w\" (UniqueName: \"kubernetes.io/projected/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-kube-api-access-ntc9w\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.702425 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.702968 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.724934 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntc9w\" (UniqueName: \"kubernetes.io/projected/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-kube-api-access-ntc9w\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.825406 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.864650 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n"] Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.866183 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.896754 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n"] Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.904688 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.904821 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqsfs\" (UniqueName: \"kubernetes.io/projected/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-kube-api-access-rqsfs\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:31 crc kubenswrapper[4886]: I0129 16:43:31.904879 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:32 
crc kubenswrapper[4886]: I0129 16:43:32.006430 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqsfs\" (UniqueName: \"kubernetes.io/projected/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-kube-api-access-rqsfs\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:32 crc kubenswrapper[4886]: I0129 16:43:32.006510 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:32 crc kubenswrapper[4886]: I0129 16:43:32.006564 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:32 crc kubenswrapper[4886]: I0129 16:43:32.007073 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:32 crc kubenswrapper[4886]: I0129 16:43:32.007318 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:32 crc kubenswrapper[4886]: I0129 16:43:32.033428 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqsfs\" (UniqueName: \"kubernetes.io/projected/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-kube-api-access-rqsfs\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:32 crc kubenswrapper[4886]: I0129 16:43:32.064001 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t"] Jan 29 16:43:32 crc kubenswrapper[4886]: I0129 16:43:32.196920 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:32 crc kubenswrapper[4886]: I0129 16:43:32.272451 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" event={"ID":"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a","Type":"ContainerStarted","Data":"3b3d7653af10af1be662575ec81d5964f016b37d552180b0fffc7f334ee3e715"} Jan 29 16:43:32 crc kubenswrapper[4886]: I0129 16:43:32.423072 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n"] Jan 29 16:43:32 crc kubenswrapper[4886]: W0129 16:43:32.425835 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb00b2947_6947_4d0a_b2d9_42adefd8ebb3.slice/crio-1d29ef1c12997096e36892fdf75d3f7775d972c0d8c2b7af17235ce3ab3f5ad1 WatchSource:0}: Error finding container 1d29ef1c12997096e36892fdf75d3f7775d972c0d8c2b7af17235ce3ab3f5ad1: Status 404 returned error can't find the container with id 1d29ef1c12997096e36892fdf75d3f7775d972c0d8c2b7af17235ce3ab3f5ad1 Jan 29 16:43:33 crc kubenswrapper[4886]: I0129 16:43:33.281280 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" event={"ID":"b00b2947-6947-4d0a-b2d9-42adefd8ebb3","Type":"ContainerStarted","Data":"1d29ef1c12997096e36892fdf75d3f7775d972c0d8c2b7af17235ce3ab3f5ad1"} Jan 29 16:43:34 crc kubenswrapper[4886]: I0129 16:43:34.291076 4886 generic.go:334] "Generic (PLEG): container finished" podID="b00b2947-6947-4d0a-b2d9-42adefd8ebb3" containerID="e4cccb4d486fe60f0edfb4f7f715ab8d92c12f9f9f4a1cfe4e00c4adc5c34b51" exitCode=0 Jan 29 16:43:34 crc kubenswrapper[4886]: I0129 16:43:34.291208 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" event={"ID":"b00b2947-6947-4d0a-b2d9-42adefd8ebb3","Type":"ContainerDied","Data":"e4cccb4d486fe60f0edfb4f7f715ab8d92c12f9f9f4a1cfe4e00c4adc5c34b51"} Jan 29 16:43:34 crc kubenswrapper[4886]: I0129 16:43:34.296184 4886 generic.go:334] "Generic (PLEG): container finished" podID="e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" containerID="a37b6266b19c1ce3a441dff00e8cafa9669109c4ad6f2385f4502687f4af460a" exitCode=0 Jan 29 16:43:34 crc kubenswrapper[4886]: I0129 16:43:34.296253 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" event={"ID":"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a","Type":"ContainerDied","Data":"a37b6266b19c1ce3a441dff00e8cafa9669109c4ad6f2385f4502687f4af460a"} Jan 29 16:43:38 crc kubenswrapper[4886]: I0129 16:43:38.338640 4886 generic.go:334] "Generic (PLEG): container finished" podID="b00b2947-6947-4d0a-b2d9-42adefd8ebb3" containerID="82c9ec7fc7823b99a453ab6558f3f2d190f9fc013e02e7613db77aca6c9d421f" exitCode=0 Jan 29 16:43:38 crc kubenswrapper[4886]: I0129 16:43:38.338751 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" event={"ID":"b00b2947-6947-4d0a-b2d9-42adefd8ebb3","Type":"ContainerDied","Data":"82c9ec7fc7823b99a453ab6558f3f2d190f9fc013e02e7613db77aca6c9d421f"} Jan 29 16:43:38 crc kubenswrapper[4886]: I0129 16:43:38.343236 4886 generic.go:334] "Generic (PLEG): container 
finished" podID="e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" containerID="0e60e37f19cf29954ac9598d39f3e907b0a8fd7df0f8e5321feafa568cea256e" exitCode=0 Jan 29 16:43:38 crc kubenswrapper[4886]: I0129 16:43:38.343297 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" event={"ID":"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a","Type":"ContainerDied","Data":"0e60e37f19cf29954ac9598d39f3e907b0a8fd7df0f8e5321feafa568cea256e"} Jan 29 16:43:39 crc kubenswrapper[4886]: I0129 16:43:39.351752 4886 generic.go:334] "Generic (PLEG): container finished" podID="e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" containerID="c3183e31247098ddd97f7b27ad0dbf70d02daf691b6fbd6a4595181aba6a0ae9" exitCode=0 Jan 29 16:43:39 crc kubenswrapper[4886]: I0129 16:43:39.351844 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" event={"ID":"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a","Type":"ContainerDied","Data":"c3183e31247098ddd97f7b27ad0dbf70d02daf691b6fbd6a4595181aba6a0ae9"} Jan 29 16:43:39 crc kubenswrapper[4886]: I0129 16:43:39.354642 4886 generic.go:334] "Generic (PLEG): container finished" podID="b00b2947-6947-4d0a-b2d9-42adefd8ebb3" containerID="8d122cad021ce2744d255a9dc7ff90dfde7fd82fdce7705c91c1c86d943ebbab" exitCode=0 Jan 29 16:43:39 crc kubenswrapper[4886]: I0129 16:43:39.354682 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" event={"ID":"b00b2947-6947-4d0a-b2d9-42adefd8ebb3","Type":"ContainerDied","Data":"8d122cad021ce2744d255a9dc7ff90dfde7fd82fdce7705c91c1c86d943ebbab"} Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.643219 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.648142 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.837656 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntc9w\" (UniqueName: \"kubernetes.io/projected/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-kube-api-access-ntc9w\") pod \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.837757 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-util\") pod \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.837779 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-bundle\") pod \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.837808 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-bundle\") pod \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\" (UID: \"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a\") " Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.837826 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-util\") pod \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.837926 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqsfs\" (UniqueName: \"kubernetes.io/projected/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-kube-api-access-rqsfs\") pod \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\" (UID: \"b00b2947-6947-4d0a-b2d9-42adefd8ebb3\") " Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.838581 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-bundle" (OuterVolumeSpecName: "bundle") pod "e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" (UID: "e6c5874b-97c3-4f3e-8e88-68c3653a6c4a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.838595 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-bundle" (OuterVolumeSpecName: "bundle") pod "b00b2947-6947-4d0a-b2d9-42adefd8ebb3" (UID: "b00b2947-6947-4d0a-b2d9-42adefd8ebb3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.844631 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-kube-api-access-ntc9w" (OuterVolumeSpecName: "kube-api-access-ntc9w") pod "e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" (UID: "e6c5874b-97c3-4f3e-8e88-68c3653a6c4a"). InnerVolumeSpecName "kube-api-access-ntc9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.845560 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-kube-api-access-rqsfs" (OuterVolumeSpecName: "kube-api-access-rqsfs") pod "b00b2947-6947-4d0a-b2d9-42adefd8ebb3" (UID: "b00b2947-6947-4d0a-b2d9-42adefd8ebb3"). InnerVolumeSpecName "kube-api-access-rqsfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.847910 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-util" (OuterVolumeSpecName: "util") pod "e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" (UID: "e6c5874b-97c3-4f3e-8e88-68c3653a6c4a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.862062 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-util" (OuterVolumeSpecName: "util") pod "b00b2947-6947-4d0a-b2d9-42adefd8ebb3" (UID: "b00b2947-6947-4d0a-b2d9-42adefd8ebb3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.939861 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqsfs\" (UniqueName: \"kubernetes.io/projected/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-kube-api-access-rqsfs\") on node \"crc\" DevicePath \"\"" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.939904 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntc9w\" (UniqueName: \"kubernetes.io/projected/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-kube-api-access-ntc9w\") on node \"crc\" DevicePath \"\"" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.939916 4886 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-util\") on node \"crc\" DevicePath \"\"" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.939928 4886 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.939941 4886 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 16:43:40 crc kubenswrapper[4886]: I0129 16:43:40.939951 4886 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00b2947-6947-4d0a-b2d9-42adefd8ebb3-util\") on node \"crc\" DevicePath \"\"" Jan 29 16:43:41 crc kubenswrapper[4886]: I0129 16:43:41.369029 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" event={"ID":"b00b2947-6947-4d0a-b2d9-42adefd8ebb3","Type":"ContainerDied","Data":"1d29ef1c12997096e36892fdf75d3f7775d972c0d8c2b7af17235ce3ab3f5ad1"} Jan 29 16:43:41 crc kubenswrapper[4886]: I0129 16:43:41.369066 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d29ef1c12997096e36892fdf75d3f7775d972c0d8c2b7af17235ce3ab3f5ad1" Jan 29 16:43:41 crc kubenswrapper[4886]: I0129 16:43:41.369277 4886 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n" Jan 29 16:43:41 crc kubenswrapper[4886]: I0129 16:43:41.371407 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" event={"ID":"e6c5874b-97c3-4f3e-8e88-68c3653a6c4a","Type":"ContainerDied","Data":"3b3d7653af10af1be662575ec81d5964f016b37d552180b0fffc7f334ee3e715"} Jan 29 16:43:41 crc kubenswrapper[4886]: I0129 16:43:41.371431 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b3d7653af10af1be662575ec81d5964f016b37d552180b0fffc7f334ee3e715" Jan 29 16:43:41 crc kubenswrapper[4886]: I0129 16:43:41.371514 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.282812 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw"] Jan 29 16:43:48 crc kubenswrapper[4886]: E0129 16:43:48.283683 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00b2947-6947-4d0a-b2d9-42adefd8ebb3" containerName="extract" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.283699 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00b2947-6947-4d0a-b2d9-42adefd8ebb3" containerName="extract" Jan 29 16:43:48 crc kubenswrapper[4886]: E0129 16:43:48.283721 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" containerName="extract" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.283729 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" containerName="extract" Jan 29 16:43:48 crc kubenswrapper[4886]: E0129 16:43:48.283743 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00b2947-6947-4d0a-b2d9-42adefd8ebb3" containerName="util" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.283751 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00b2947-6947-4d0a-b2d9-42adefd8ebb3" containerName="util" Jan 29 16:43:48 crc kubenswrapper[4886]: E0129 16:43:48.283765 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00b2947-6947-4d0a-b2d9-42adefd8ebb3" containerName="pull" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.283773 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00b2947-6947-4d0a-b2d9-42adefd8ebb3" containerName="pull" Jan 29 16:43:48 crc kubenswrapper[4886]: E0129 16:43:48.283790 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" containerName="pull" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.283798 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" containerName="pull" Jan 29 16:43:48 crc kubenswrapper[4886]: E0129 16:43:48.283807 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" containerName="util" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.283814 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" containerName="util" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.283959 4886 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b00b2947-6947-4d0a-b2d9-42adefd8ebb3" containerName="extract" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.283977 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" containerName="extract" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.284843 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.287769 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.288058 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.288590 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.288808 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.289168 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.289535 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-tvwsb" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.307748 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw"] Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.449548 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-manager-config\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.449883 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz5vt\" (UniqueName: \"kubernetes.io/projected/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-kube-api-access-vz5vt\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.449923 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-webhook-cert\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.449955 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.449993 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-apiservice-cert\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.551111 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-webhook-cert\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.551186 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.551245 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-apiservice-cert\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.551289 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-manager-config\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.551342 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz5vt\" (UniqueName: \"kubernetes.io/projected/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-kube-api-access-vz5vt\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.552578 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-manager-config\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.557195 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-apiservice-cert\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.557727 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-webhook-cert\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.562475 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.567049 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz5vt\" (UniqueName: \"kubernetes.io/projected/994fe9e1-7adf-4aab-bc9e-d51fd52286a9-kube-api-access-vz5vt\") pod \"loki-operator-controller-manager-5b44bcdc44-bgqfw\" (UID: \"994fe9e1-7adf-4aab-bc9e-d51fd52286a9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.607893 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:43:48 crc kubenswrapper[4886]: I0129 16:43:48.854431 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw"] Jan 29 16:43:49 crc kubenswrapper[4886]: I0129 16:43:49.421889 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" event={"ID":"994fe9e1-7adf-4aab-bc9e-d51fd52286a9","Type":"ContainerStarted","Data":"ba9a5d85b3ffbc6869ac3918f6bc131600276658ec5cc190c42bbcfd7659bf26"} Jan 29 16:43:52 crc kubenswrapper[4886]: I0129 16:43:52.198200 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-hgdlt"] Jan 29 16:43:52 crc kubenswrapper[4886]: I0129 16:43:52.200776 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hgdlt" Jan 29 16:43:52 crc kubenswrapper[4886]: I0129 16:43:52.204661 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Jan 29 16:43:52 crc kubenswrapper[4886]: I0129 16:43:52.204994 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-vvmn2" Jan 29 16:43:52 crc kubenswrapper[4886]: I0129 16:43:52.205194 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Jan 29 16:43:52 crc kubenswrapper[4886]: I0129 16:43:52.211604 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-hgdlt"] Jan 29 16:43:52 crc kubenswrapper[4886]: I0129 16:43:52.312262 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn9zg\" (UniqueName: \"kubernetes.io/projected/7f5851a1-d10c-445d-bffc-12a6acc01ead-kube-api-access-hn9zg\") pod \"cluster-logging-operator-79cf69ddc8-hgdlt\" (UID: \"7f5851a1-d10c-445d-bffc-12a6acc01ead\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hgdlt" Jan 29 16:43:52 crc kubenswrapper[4886]: I0129 16:43:52.413193 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9zg\" (UniqueName: \"kubernetes.io/projected/7f5851a1-d10c-445d-bffc-12a6acc01ead-kube-api-access-hn9zg\") pod \"cluster-logging-operator-79cf69ddc8-hgdlt\" (UID: \"7f5851a1-d10c-445d-bffc-12a6acc01ead\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hgdlt" Jan 29 16:43:52 crc kubenswrapper[4886]: I0129 16:43:52.440399 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn9zg\" (UniqueName: \"kubernetes.io/projected/7f5851a1-d10c-445d-bffc-12a6acc01ead-kube-api-access-hn9zg\") pod \"cluster-logging-operator-79cf69ddc8-hgdlt\" (UID: \"7f5851a1-d10c-445d-bffc-12a6acc01ead\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hgdlt" Jan 29 16:43:52 crc kubenswrapper[4886]: I0129 16:43:52.523690 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hgdlt" Jan 29 16:43:54 crc kubenswrapper[4886]: I0129 16:43:54.522249 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-hgdlt"] Jan 29 16:43:55 crc kubenswrapper[4886]: I0129 16:43:55.473475 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hgdlt" event={"ID":"7f5851a1-d10c-445d-bffc-12a6acc01ead","Type":"ContainerStarted","Data":"1c863aaa14c7b6806e471be9f33b5f8232c61b26550f24b07513ac5c9bbb6931"} Jan 29 16:43:55 crc kubenswrapper[4886]: I0129 16:43:55.475438 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" event={"ID":"994fe9e1-7adf-4aab-bc9e-d51fd52286a9","Type":"ContainerStarted","Data":"68813b1abb27e77fc3f9ffa2e46de8cc5d9ca9355ad6ca0972ac29165f1bba50"} Jan 29 16:43:59 crc kubenswrapper[4886]: I0129 16:43:59.661221 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:43:59 crc kubenswrapper[4886]: I0129 16:43:59.661825 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:44:04 crc kubenswrapper[4886]: I0129 16:44:04.543204 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" event={"ID":"994fe9e1-7adf-4aab-bc9e-d51fd52286a9","Type":"ContainerStarted","Data":"6a40817d5e711fbe7de63ecd5931053ea427448bc64bccc055e04dc1036c0cc1"} Jan 29 16:44:04 crc kubenswrapper[4886]: I0129 16:44:04.544786 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:44:04 crc kubenswrapper[4886]: I0129 16:44:04.546079 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" Jan 29 16:44:04 crc kubenswrapper[4886]: I0129 16:44:04.546115 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hgdlt" event={"ID":"7f5851a1-d10c-445d-bffc-12a6acc01ead","Type":"ContainerStarted","Data":"ba2cf50913b27ad205a4b605a3888e5d49d8cb1cbb8b48fb51fc3234dabf665e"} Jan 29 16:44:04 crc kubenswrapper[4886]: I0129 16:44:04.566374 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-5b44bcdc44-bgqfw" podStartSLOduration=1.154885225 podStartE2EDuration="16.566355739s" podCreationTimestamp="2026-01-29 16:43:48 +0000 UTC" firstStartedPulling="2026-01-29 16:43:48.867139496 +0000 UTC m=+1311.775858758" lastFinishedPulling="2026-01-29 16:44:04.27860999 +0000 UTC m=+1327.187329272" observedRunningTime="2026-01-29 16:44:04.563054077 +0000 UTC m=+1327.471773369" watchObservedRunningTime="2026-01-29 16:44:04.566355739 +0000 UTC m=+1327.475075021" Jan 29 16:44:04 crc kubenswrapper[4886]: I0129 
16:44:04.610837 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hgdlt" podStartSLOduration=2.956335112 podStartE2EDuration="12.610815807s" podCreationTimestamp="2026-01-29 16:43:52 +0000 UTC" firstStartedPulling="2026-01-29 16:43:54.537953174 +0000 UTC m=+1317.446672446" lastFinishedPulling="2026-01-29 16:44:04.192433869 +0000 UTC m=+1327.101153141" observedRunningTime="2026-01-29 16:44:04.607603728 +0000 UTC m=+1327.516323020" watchObservedRunningTime="2026-01-29 16:44:04.610815807 +0000 UTC m=+1327.519535099" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.576278 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.577409 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.579544 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.579762 4886 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-slt87" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.587841 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.595029 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.609176 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnvp5\" (UniqueName: \"kubernetes.io/projected/ca730b03-66b8-4129-8cf2-2661a1baae99-kube-api-access-lnvp5\") pod \"minio\" (UID: \"ca730b03-66b8-4129-8cf2-2661a1baae99\") " pod="minio-dev/minio" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.609463 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f92a964-ec52-44d0-bd50-9ea187253084\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f92a964-ec52-44d0-bd50-9ea187253084\") pod \"minio\" (UID: \"ca730b03-66b8-4129-8cf2-2661a1baae99\") " pod="minio-dev/minio" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.710832 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f92a964-ec52-44d0-bd50-9ea187253084\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f92a964-ec52-44d0-bd50-9ea187253084\") pod \"minio\" (UID: \"ca730b03-66b8-4129-8cf2-2661a1baae99\") " pod="minio-dev/minio" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.711303 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnvp5\" (UniqueName: \"kubernetes.io/projected/ca730b03-66b8-4129-8cf2-2661a1baae99-kube-api-access-lnvp5\") pod \"minio\" (UID: \"ca730b03-66b8-4129-8cf2-2661a1baae99\") " pod="minio-dev/minio" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.714125 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
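The pod_startup_latency_tracker entries above report podStartSLOduration alongside podStartE2EDuration, firstStartedPulling and lastFinishedPulling. The figures are consistent with the SLO duration being the end-to-end startup time minus the image-pull window (lastFinishedPulling - firstStartedPulling); for loki-operator-controller-manager that is 16.566355739s - 15.411470s ≈ 1.1549s, matching podStartSLOduration=1.154885225. A minimal Go sketch of that arithmetic, using values copied from the loki-operator entry (the kubelet works from monotonic clock readings, so the last digits can differ slightly):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the "Observed pod startup duration" entry above;
	// the trailing m=+... monotonic annotations are dropped for parsing.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	podCreation := parse("2026-01-29 16:43:48 +0000 UTC")
	firstStartedPulling := parse("2026-01-29 16:43:48.867139496 +0000 UTC")
	lastFinishedPulling := parse("2026-01-29 16:44:04.27860999 +0000 UTC")
	watchObservedRunning := parse("2026-01-29 16:44:04.566355739 +0000 UTC")

	e2e := watchObservedRunning.Sub(podCreation)              // podStartE2EDuration
	slo := e2e - lastFinishedPulling.Sub(firstStartedPulling) // startup time excluding the image-pull window
	fmt.Printf("podStartE2EDuration=%v podStartSLOduration=%v\n", e2e, slo)
}

Run with go run, this prints 16.566355739s and about 1.154885245s, within a few tens of nanoseconds of the 1.154885225s recorded in the entry above.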
Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.714177 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f92a964-ec52-44d0-bd50-9ea187253084\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f92a964-ec52-44d0-bd50-9ea187253084\") pod \"minio\" (UID: \"ca730b03-66b8-4129-8cf2-2661a1baae99\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ac43d28ac76fe5a6ff50043777e896a20c8968497a85466ecb9b263eeca3a165/globalmount\"" pod="minio-dev/minio" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.742464 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f92a964-ec52-44d0-bd50-9ea187253084\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f92a964-ec52-44d0-bd50-9ea187253084\") pod \"minio\" (UID: \"ca730b03-66b8-4129-8cf2-2661a1baae99\") " pod="minio-dev/minio" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.744403 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnvp5\" (UniqueName: \"kubernetes.io/projected/ca730b03-66b8-4129-8cf2-2661a1baae99-kube-api-access-lnvp5\") pod \"minio\" (UID: \"ca730b03-66b8-4129-8cf2-2661a1baae99\") " pod="minio-dev/minio" Jan 29 16:44:08 crc kubenswrapper[4886]: I0129 16:44:08.900871 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Jan 29 16:44:09 crc kubenswrapper[4886]: I0129 16:44:09.338444 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 29 16:44:09 crc kubenswrapper[4886]: I0129 16:44:09.577267 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"ca730b03-66b8-4129-8cf2-2661a1baae99","Type":"ContainerStarted","Data":"6ce2a9575576a20f51b733648f7267cb6d0a573c22532b8639b0ec3216ff2215"} Jan 29 16:44:14 crc kubenswrapper[4886]: I0129 16:44:14.608195 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"ca730b03-66b8-4129-8cf2-2661a1baae99","Type":"ContainerStarted","Data":"5d8c34c88ba4581d9cf41116cfb3af3c8eb7e4ce38737bd3eb408f90a4d7443e"} Jan 29 16:44:14 crc kubenswrapper[4886]: I0129 16:44:14.626578 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.881121467 podStartE2EDuration="8.62655469s" podCreationTimestamp="2026-01-29 16:44:06 +0000 UTC" firstStartedPulling="2026-01-29 16:44:09.356035145 +0000 UTC m=+1332.264754417" lastFinishedPulling="2026-01-29 16:44:14.101468358 +0000 UTC m=+1337.010187640" observedRunningTime="2026-01-29 16:44:14.623384531 +0000 UTC m=+1337.532103823" watchObservedRunningTime="2026-01-29 16:44:14.62655469 +0000 UTC m=+1337.535273962" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.178712 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb"] Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.181070 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.183830 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.184057 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.186591 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.186856 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.187169 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-jcxps" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.194713 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb"] Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.289343 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.289506 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-config\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.289634 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mcqr\" (UniqueName: \"kubernetes.io/projected/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-kube-api-access-7mcqr\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.289700 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.289835 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.346300 4886 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-logging/logging-loki-querier-76788598db-85zgx"] Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.347264 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.351202 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.351711 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.351811 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.361936 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-85zgx"] Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.390914 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.391343 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.391381 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-config\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.391416 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mcqr\" (UniqueName: \"kubernetes.io/projected/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-kube-api-access-7mcqr\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.391445 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.392654 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " 
pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.392916 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-config\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.396693 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.399911 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.421277 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mcqr\" (UniqueName: \"kubernetes.io/projected/befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1-kube-api-access-7mcqr\") pod \"logging-loki-distributor-5f678c8dd6-2jzzb\" (UID: \"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.439672 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr"] Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.440779 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.448492 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.448693 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.455881 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr"] Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.493178 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.493237 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.493307 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-s3\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.493347 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khtrb\" (UniqueName: \"kubernetes.io/projected/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-kube-api-access-khtrb\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.493390 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-config\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.493406 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.510486 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.558193 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-8587c9555d-m4k69"] Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.559286 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.564085 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-n4kj5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.564356 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.564613 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.564830 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.565655 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.567929 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.576401 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-8587c9555d-cszl5"] Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.577887 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.585836 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-8587c9555d-m4k69"] Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.596697 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khtrb\" (UniqueName: \"kubernetes.io/projected/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-kube-api-access-khtrb\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.596755 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/fa3af54b-5759-4b53-a998-720bd2ff4608-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.596807 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-config\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.596830 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.596882 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/fa3af54b-5759-4b53-a998-720bd2ff4608-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.596924 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa3af54b-5759-4b53-a998-720bd2ff4608-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.596955 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.596978 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kz7sk\" (UniqueName: \"kubernetes.io/projected/fa3af54b-5759-4b53-a998-720bd2ff4608-kube-api-access-kz7sk\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.597006 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.597037 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3af54b-5759-4b53-a998-720bd2ff4608-config\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.597065 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-s3\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.600495 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-8587c9555d-cszl5"] Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.602390 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-config\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.602740 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.609958 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.611723 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.621826 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-logging-loki-s3\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.652406 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khtrb\" (UniqueName: \"kubernetes.io/projected/fb80c257-3e6a-45c8-bb6f-6fb2676ef296-kube-api-access-khtrb\") pod \"logging-loki-querier-76788598db-85zgx\" (UID: \"fb80c257-3e6a-45c8-bb6f-6fb2676ef296\") " pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.669693 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.707336 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgxgf\" (UniqueName: \"kubernetes.io/projected/046307bd-2e5e-4d92-b934-57ed8882d1bc-kube-api-access-wgxgf\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.707396 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/fa3af54b-5759-4b53-a998-720bd2ff4608-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.707422 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.707445 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/046307bd-2e5e-4d92-b934-57ed8882d1bc-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.707858 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-tenants\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.707978 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/046307bd-2e5e-4d92-b934-57ed8882d1bc-tls-secret\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " 
pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708054 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tgmn\" (UniqueName: \"kubernetes.io/projected/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-kube-api-access-8tgmn\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708365 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/046307bd-2e5e-4d92-b934-57ed8882d1bc-tenants\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708419 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-tls-secret\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708439 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-rbac\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708493 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-logging-loki-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708535 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/fa3af54b-5759-4b53-a998-720bd2ff4608-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708554 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-lokistack-gateway\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708584 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-logging-loki-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " 
pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708613 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708673 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa3af54b-5759-4b53-a998-720bd2ff4608-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708732 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz7sk\" (UniqueName: \"kubernetes.io/projected/fa3af54b-5759-4b53-a998-720bd2ff4608-kube-api-access-kz7sk\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708768 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-lokistack-gateway\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708791 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708811 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-rbac\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.708838 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3af54b-5759-4b53-a998-720bd2ff4608-config\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.710439 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3af54b-5759-4b53-a998-720bd2ff4608-config\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" 
Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.714065 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa3af54b-5759-4b53-a998-720bd2ff4608-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.727464 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/fa3af54b-5759-4b53-a998-720bd2ff4608-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.727776 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/fa3af54b-5759-4b53-a998-720bd2ff4608-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.735304 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz7sk\" (UniqueName: \"kubernetes.io/projected/fa3af54b-5759-4b53-a998-720bd2ff4608-kube-api-access-kz7sk\") pod \"logging-loki-query-frontend-69d9546745-9q2lr\" (UID: \"fa3af54b-5759-4b53-a998-720bd2ff4608\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.782213 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810006 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-lokistack-gateway\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810046 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810070 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-rbac\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810096 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgxgf\" (UniqueName: \"kubernetes.io/projected/046307bd-2e5e-4d92-b934-57ed8882d1bc-kube-api-access-wgxgf\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810120 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810141 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/046307bd-2e5e-4d92-b934-57ed8882d1bc-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810155 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-tenants\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810172 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/046307bd-2e5e-4d92-b934-57ed8882d1bc-tls-secret\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810189 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tgmn\" (UniqueName: \"kubernetes.io/projected/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-kube-api-access-8tgmn\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810212 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/046307bd-2e5e-4d92-b934-57ed8882d1bc-tenants\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810234 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-tls-secret\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810249 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-rbac\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810272 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-logging-loki-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810291 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-lokistack-gateway\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810310 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-logging-loki-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.810343 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.811093 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.812874 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-rbac\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.813839 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-rbac\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.816492 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-logging-loki-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.816555 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/046307bd-2e5e-4d92-b934-57ed8882d1bc-tenants\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.816656 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-lokistack-gateway\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.816756 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-logging-loki-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.817381 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/046307bd-2e5e-4d92-b934-57ed8882d1bc-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.821293 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-lokistack-gateway\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc 
kubenswrapper[4886]: I0129 16:44:20.825526 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-tls-secret\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.825698 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.825737 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-tenants\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.835658 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/046307bd-2e5e-4d92-b934-57ed8882d1bc-tls-secret\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.835805 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/046307bd-2e5e-4d92-b934-57ed8882d1bc-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.841926 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgxgf\" (UniqueName: \"kubernetes.io/projected/046307bd-2e5e-4d92-b934-57ed8882d1bc-kube-api-access-wgxgf\") pod \"logging-loki-gateway-8587c9555d-m4k69\" (UID: \"046307bd-2e5e-4d92-b934-57ed8882d1bc\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.844042 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tgmn\" (UniqueName: \"kubernetes.io/projected/c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b-kube-api-access-8tgmn\") pod \"logging-loki-gateway-8587c9555d-cszl5\" (UID: \"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b\") " pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.921838 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:20 crc kubenswrapper[4886]: I0129 16:44:20.970958 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:20.999440 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb"] Jan 29 16:44:21 crc kubenswrapper[4886]: W0129 16:44:21.007079 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbefd63fe_2ae3_4bb3_86fd_ac5486d7fbd1.slice/crio-8af0157bba70a29b5bc7d3c507bd6e596e9c97f47fb5e2fb053c7978b2ddd013 WatchSource:0}: Error finding container 8af0157bba70a29b5bc7d3c507bd6e596e9c97f47fb5e2fb053c7978b2ddd013: Status 404 returned error can't find the container with id 8af0157bba70a29b5bc7d3c507bd6e596e9c97f47fb5e2fb053c7978b2ddd013 Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.193020 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-85zgx"] Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.324556 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr"] Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.337191 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-8587c9555d-m4k69"] Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.341575 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.342561 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.345089 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.345292 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.348312 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.478026 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.479081 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.481446 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.482149 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.483257 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.505163 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.505972 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.511315 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.511985 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.523933 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.524548 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.524650 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.524684 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22mlm\" (UniqueName: \"kubernetes.io/projected/0dd1a523-96c1-4311-9452-92e6da8a7e9b-kube-api-access-22mlm\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.524710 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.524733 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd1a523-96c1-4311-9452-92e6da8a7e9b-config\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.524754 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4fdac16a-ee35-40f8-903b-eb0d0da233ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4fdac16a-ee35-40f8-903b-eb0d0da233ab\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.524784 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1af84dd6-0683-4c8c-b3e8-62d9ada051fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1af84dd6-0683-4c8c-b3e8-62d9ada051fe\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " 
pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.524816 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.568139 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-8587c9555d-cszl5"] Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.626739 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-config\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.626800 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.626824 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.626858 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.626882 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627038 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627121 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " 
pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627177 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627212 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22mlm\" (UniqueName: \"kubernetes.io/projected/0dd1a523-96c1-4311-9452-92e6da8a7e9b-kube-api-access-22mlm\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627288 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627353 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f5f1690-e741-43ac-b894-10ed3cbabe48\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f5f1690-e741-43ac-b894-10ed3cbabe48\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627396 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4fdac16a-ee35-40f8-903b-eb0d0da233ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4fdac16a-ee35-40f8-903b-eb0d0da233ab\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627452 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627528 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627559 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627647 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7qv5\" (UniqueName: \"kubernetes.io/projected/6059a5a7-5b65-481d-9b0f-f40d863e8310-kube-api-access-g7qv5\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627708 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4c66\" (UniqueName: \"kubernetes.io/projected/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-kube-api-access-j4c66\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627735 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6059a5a7-5b65-481d-9b0f-f40d863e8310-config\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627791 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0c763bef-e323-4b7f-ab21-3f0f2ab7b02d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c763bef-e323-4b7f-ab21-3f0f2ab7b02d\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627848 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd1a523-96c1-4311-9452-92e6da8a7e9b-config\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627888 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1af84dd6-0683-4c8c-b3e8-62d9ada051fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1af84dd6-0683-4c8c-b3e8-62d9ada051fe\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.627916 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.631036 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.631654 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd1a523-96c1-4311-9452-92e6da8a7e9b-config\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " 
pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.638139 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.638185 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.638399 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/0dd1a523-96c1-4311-9452-92e6da8a7e9b-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.640242 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.640360 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4fdac16a-ee35-40f8-903b-eb0d0da233ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4fdac16a-ee35-40f8-903b-eb0d0da233ab\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dfefc5fccf628ea15fdfe7921099c01b4bb138ba6509b8a6c369076c47177cbf/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.644703 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.647709 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22mlm\" (UniqueName: \"kubernetes.io/projected/0dd1a523-96c1-4311-9452-92e6da8a7e9b-kube-api-access-22mlm\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.649464 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1af84dd6-0683-4c8c-b3e8-62d9ada051fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1af84dd6-0683-4c8c-b3e8-62d9ada051fe\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5b3978f589a62496479536024629daab06f7d4d39c9730314cdaa09e60ca86e3/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.665729 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" event={"ID":"fa3af54b-5759-4b53-a998-720bd2ff4608","Type":"ContainerStarted","Data":"bb95c23a9849ae26eaf9b7e2223192a8b25a7a63dfa2b7aef6d6de9edbdb7474"} Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.666715 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" event={"ID":"046307bd-2e5e-4d92-b934-57ed8882d1bc","Type":"ContainerStarted","Data":"c0f31a01bd117232cb2946e2cf38a8076865b0b71d4b859457f95dc2897a3304"} Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.667837 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" event={"ID":"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b","Type":"ContainerStarted","Data":"34f0563aa3055d8c201e2d035e1a56d218e891682f5f9c02b4dae8c4441563f6"} Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.670103 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" event={"ID":"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1","Type":"ContainerStarted","Data":"8af0157bba70a29b5bc7d3c507bd6e596e9c97f47fb5e2fb053c7978b2ddd013"} Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.671319 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-85zgx" event={"ID":"fb80c257-3e6a-45c8-bb6f-6fb2676ef296","Type":"ContainerStarted","Data":"77ca3f196adfbd0d8677f15cf0e5d57bd3c0a4db27bf1aa94440340f876353f4"} Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.672160 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4fdac16a-ee35-40f8-903b-eb0d0da233ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4fdac16a-ee35-40f8-903b-eb0d0da233ab\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.679431 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1af84dd6-0683-4c8c-b3e8-62d9ada051fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1af84dd6-0683-4c8c-b3e8-62d9ada051fe\") pod \"logging-loki-ingester-0\" (UID: \"0dd1a523-96c1-4311-9452-92e6da8a7e9b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 
16:44:21.729777 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.729844 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7qv5\" (UniqueName: \"kubernetes.io/projected/6059a5a7-5b65-481d-9b0f-f40d863e8310-kube-api-access-g7qv5\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.729869 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4c66\" (UniqueName: \"kubernetes.io/projected/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-kube-api-access-j4c66\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.729884 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6059a5a7-5b65-481d-9b0f-f40d863e8310-config\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.729917 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0c763bef-e323-4b7f-ab21-3f0f2ab7b02d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c763bef-e323-4b7f-ab21-3f0f2ab7b02d\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.729941 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.729975 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-config\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.730007 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.730026 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " 
pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.730042 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.730066 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.730085 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.730109 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f5f1690-e741-43ac-b894-10ed3cbabe48\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f5f1690-e741-43ac-b894-10ed3cbabe48\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.730132 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.730953 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.731978 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.731996 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-config\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.732363 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6059a5a7-5b65-481d-9b0f-f40d863e8310-config\") 
pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.733429 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.733442 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.733471 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0c763bef-e323-4b7f-ab21-3f0f2ab7b02d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c763bef-e323-4b7f-ab21-3f0f2ab7b02d\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3dacbae4f7a33d2ee419c7c6a0927f4eea7710cf7144d70769ad46cc2dc1508a/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.733483 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f5f1690-e741-43ac-b894-10ed3cbabe48\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f5f1690-e741-43ac-b894-10ed3cbabe48\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3909a1c7239f5f2fe75c2ea0c916f7ceb2a5b008d6c55126ad2d53193ffa5c3c/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.734260 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.734360 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.734674 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.734895 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/6059a5a7-5b65-481d-9b0f-f40d863e8310-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.735940 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.744798 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.747895 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4c66\" (UniqueName: \"kubernetes.io/projected/37c313cd-31f0-4fb3-9241-a3a59b1f55a6-kube-api-access-j4c66\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.755057 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7qv5\" (UniqueName: \"kubernetes.io/projected/6059a5a7-5b65-481d-9b0f-f40d863e8310-kube-api-access-g7qv5\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.759697 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0c763bef-e323-4b7f-ab21-3f0f2ab7b02d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c763bef-e323-4b7f-ab21-3f0f2ab7b02d\") pod \"logging-loki-index-gateway-0\" (UID: \"6059a5a7-5b65-481d-9b0f-f40d863e8310\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.775685 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f5f1690-e741-43ac-b894-10ed3cbabe48\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f5f1690-e741-43ac-b894-10ed3cbabe48\") pod \"logging-loki-compactor-0\" (UID: \"37c313cd-31f0-4fb3-9241-a3a59b1f55a6\") " pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.806035 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.824581 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:21 crc kubenswrapper[4886]: I0129 16:44:21.969971 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:22 crc kubenswrapper[4886]: I0129 16:44:22.244880 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 29 16:44:22 crc kubenswrapper[4886]: W0129 16:44:22.245719 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37c313cd_31f0_4fb3_9241_a3a59b1f55a6.slice/crio-a551dc8be456869ea1e1222b18e854616a4c7e4dace41621eb275eb60b96cd55 WatchSource:0}: Error finding container a551dc8be456869ea1e1222b18e854616a4c7e4dace41621eb275eb60b96cd55: Status 404 returned error can't find the container with id a551dc8be456869ea1e1222b18e854616a4c7e4dace41621eb275eb60b96cd55 Jan 29 16:44:22 crc kubenswrapper[4886]: I0129 16:44:22.307602 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 29 16:44:22 crc kubenswrapper[4886]: W0129 16:44:22.315237 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6059a5a7_5b65_481d_9b0f_f40d863e8310.slice/crio-b45347c54ba64555976f02d8b2d5db19c6794894f5b3f77c3da6f026a87848ab WatchSource:0}: Error finding container b45347c54ba64555976f02d8b2d5db19c6794894f5b3f77c3da6f026a87848ab: Status 404 returned error can't find the container with id b45347c54ba64555976f02d8b2d5db19c6794894f5b3f77c3da6f026a87848ab Jan 29 16:44:22 crc kubenswrapper[4886]: I0129 16:44:22.416426 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Jan 29 16:44:22 crc kubenswrapper[4886]: I0129 16:44:22.678747 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"37c313cd-31f0-4fb3-9241-a3a59b1f55a6","Type":"ContainerStarted","Data":"a551dc8be456869ea1e1222b18e854616a4c7e4dace41621eb275eb60b96cd55"} Jan 29 16:44:22 crc kubenswrapper[4886]: I0129 16:44:22.680798 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"0dd1a523-96c1-4311-9452-92e6da8a7e9b","Type":"ContainerStarted","Data":"8e80501c245fd584742ca0aeeba230a29fbeddca37fb0c1cb655df5f1e1f2e3d"} Jan 29 16:44:22 crc kubenswrapper[4886]: I0129 16:44:22.681859 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"6059a5a7-5b65-481d-9b0f-f40d863e8310","Type":"ContainerStarted","Data":"b45347c54ba64555976f02d8b2d5db19c6794894f5b3f77c3da6f026a87848ab"} Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.713855 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" event={"ID":"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b","Type":"ContainerStarted","Data":"11555333d970cf0b5c68a36387a912e24adea362f7935b44573ae3fd14f4ac21"} Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.715764 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"0dd1a523-96c1-4311-9452-92e6da8a7e9b","Type":"ContainerStarted","Data":"7686228f02477b7ff31b7e28ac5f0c82132ef45f9b6f7fba4b4633855e191242"} Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.715818 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.717572 4886 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" event={"ID":"befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1","Type":"ContainerStarted","Data":"804e2daa82e34c76d8b1f2bedac109c5769096ed12aa8dd35163911432df9432"} Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.717709 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.719132 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-85zgx" event={"ID":"fb80c257-3e6a-45c8-bb6f-6fb2676ef296","Type":"ContainerStarted","Data":"afeb486c3647cf154609c0757d87fc078c0f9cec0dafdc955d849b9054c655ef"} Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.719248 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.720238 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"6059a5a7-5b65-481d-9b0f-f40d863e8310","Type":"ContainerStarted","Data":"b046bed7cdcbf761144683f50ae015d81d5c196ab45a554254960d666e3ae48e"} Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.720375 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.721430 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" event={"ID":"046307bd-2e5e-4d92-b934-57ed8882d1bc","Type":"ContainerStarted","Data":"b58f31f74619068ed2a987c2c19a9c7c9d04c3ee32ad011a41acca1d9ae2c126"} Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.722562 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" event={"ID":"fa3af54b-5759-4b53-a998-720bd2ff4608","Type":"ContainerStarted","Data":"7492769110a81fcaf0a6c529adb508a56b0abd143bd844812b0cb5eb702882ff"} Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.723195 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.725448 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"37c313cd-31f0-4fb3-9241-a3a59b1f55a6","Type":"ContainerStarted","Data":"d5e326b3ffa182ca9ac8c50df1d959306c0b45b8ac2b6b70cdbc1d9e40f63b3d"} Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.725590 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.749692 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.033628762 podStartE2EDuration="5.749676536s" podCreationTimestamp="2026-01-29 16:44:20 +0000 UTC" firstStartedPulling="2026-01-29 16:44:22.440501216 +0000 UTC m=+1345.349220488" lastFinishedPulling="2026-01-29 16:44:25.15654899 +0000 UTC m=+1348.065268262" observedRunningTime="2026-01-29 16:44:25.739575455 +0000 UTC m=+1348.648294727" watchObservedRunningTime="2026-01-29 16:44:25.749676536 +0000 UTC m=+1348.658395808" Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.765276 4886 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=2.92225166 podStartE2EDuration="5.765257931s" podCreationTimestamp="2026-01-29 16:44:20 +0000 UTC" firstStartedPulling="2026-01-29 16:44:22.317651043 +0000 UTC m=+1345.226370315" lastFinishedPulling="2026-01-29 16:44:25.160657314 +0000 UTC m=+1348.069376586" observedRunningTime="2026-01-29 16:44:25.757159365 +0000 UTC m=+1348.665878637" watchObservedRunningTime="2026-01-29 16:44:25.765257931 +0000 UTC m=+1348.673977203" Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.781955 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" podStartSLOduration=1.9495604960000001 podStartE2EDuration="5.781931675s" podCreationTimestamp="2026-01-29 16:44:20 +0000 UTC" firstStartedPulling="2026-01-29 16:44:21.332857473 +0000 UTC m=+1344.241576745" lastFinishedPulling="2026-01-29 16:44:25.165228652 +0000 UTC m=+1348.073947924" observedRunningTime="2026-01-29 16:44:25.779480217 +0000 UTC m=+1348.688199509" watchObservedRunningTime="2026-01-29 16:44:25.781931675 +0000 UTC m=+1348.690650947" Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.821809 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=2.9610464690000002 podStartE2EDuration="5.821781285s" podCreationTimestamp="2026-01-29 16:44:20 +0000 UTC" firstStartedPulling="2026-01-29 16:44:22.250123731 +0000 UTC m=+1345.158843013" lastFinishedPulling="2026-01-29 16:44:25.110858557 +0000 UTC m=+1348.019577829" observedRunningTime="2026-01-29 16:44:25.797826438 +0000 UTC m=+1348.706545710" watchObservedRunningTime="2026-01-29 16:44:25.821781285 +0000 UTC m=+1348.730500567" Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.836001 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" podStartSLOduration=1.6677709040000002 podStartE2EDuration="5.835980571s" podCreationTimestamp="2026-01-29 16:44:20 +0000 UTC" firstStartedPulling="2026-01-29 16:44:21.016021154 +0000 UTC m=+1343.924740426" lastFinishedPulling="2026-01-29 16:44:25.184230821 +0000 UTC m=+1348.092950093" observedRunningTime="2026-01-29 16:44:25.825918531 +0000 UTC m=+1348.734637803" watchObservedRunningTime="2026-01-29 16:44:25.835980571 +0000 UTC m=+1348.744699853" Jan 29 16:44:25 crc kubenswrapper[4886]: I0129 16:44:25.845345 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-76788598db-85zgx" podStartSLOduration=1.8790429309999999 podStartE2EDuration="5.845307351s" podCreationTimestamp="2026-01-29 16:44:20 +0000 UTC" firstStartedPulling="2026-01-29 16:44:21.22294511 +0000 UTC m=+1344.131664382" lastFinishedPulling="2026-01-29 16:44:25.18920953 +0000 UTC m=+1348.097928802" observedRunningTime="2026-01-29 16:44:25.842442591 +0000 UTC m=+1348.751161883" watchObservedRunningTime="2026-01-29 16:44:25.845307351 +0000 UTC m=+1348.754026623" Jan 29 16:44:29 crc kubenswrapper[4886]: I0129 16:44:29.661383 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:44:29 crc 
kubenswrapper[4886]: I0129 16:44:29.661652 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:44:29 crc kubenswrapper[4886]: I0129 16:44:29.759249 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" event={"ID":"c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b","Type":"ContainerStarted","Data":"a59042ea205bed00605dd73fa40dc9f973e22640b33de559f4f2879ed5df1cda"} Jan 29 16:44:29 crc kubenswrapper[4886]: I0129 16:44:29.760166 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:29 crc kubenswrapper[4886]: I0129 16:44:29.760431 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:29 crc kubenswrapper[4886]: I0129 16:44:29.778835 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:29 crc kubenswrapper[4886]: I0129 16:44:29.780642 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" Jan 29 16:44:29 crc kubenswrapper[4886]: I0129 16:44:29.795592 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-8587c9555d-cszl5" podStartSLOduration=2.002796131 podStartE2EDuration="9.795572806s" podCreationTimestamp="2026-01-29 16:44:20 +0000 UTC" firstStartedPulling="2026-01-29 16:44:21.575021451 +0000 UTC m=+1344.483740723" lastFinishedPulling="2026-01-29 16:44:29.367798126 +0000 UTC m=+1352.276517398" observedRunningTime="2026-01-29 16:44:29.784278152 +0000 UTC m=+1352.692997424" watchObservedRunningTime="2026-01-29 16:44:29.795572806 +0000 UTC m=+1352.704292088" Jan 29 16:44:36 crc kubenswrapper[4886]: I0129 16:44:36.822947 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" event={"ID":"046307bd-2e5e-4d92-b934-57ed8882d1bc","Type":"ContainerStarted","Data":"0623f2702768ecd82ed023b2c9c84d2e0d51b9b0e6841d9171ff5498cf034bc7"} Jan 29 16:44:36 crc kubenswrapper[4886]: I0129 16:44:36.823395 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:36 crc kubenswrapper[4886]: I0129 16:44:36.837549 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:36 crc kubenswrapper[4886]: I0129 16:44:36.853779 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" podStartSLOduration=2.395511883 podStartE2EDuration="16.853751375s" podCreationTimestamp="2026-01-29 16:44:20 +0000 UTC" firstStartedPulling="2026-01-29 16:44:21.337396419 +0000 UTC m=+1344.246115691" lastFinishedPulling="2026-01-29 16:44:35.795635881 +0000 UTC m=+1358.704355183" observedRunningTime="2026-01-29 16:44:36.847170642 +0000 UTC m=+1359.755889934" watchObservedRunningTime="2026-01-29 16:44:36.853751375 +0000 UTC m=+1359.762470667" Jan 29 16:44:37 crc kubenswrapper[4886]: I0129 
16:44:37.831372 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:37 crc kubenswrapper[4886]: I0129 16:44:37.847675 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-8587c9555d-m4k69" Jan 29 16:44:40 crc kubenswrapper[4886]: I0129 16:44:40.520844 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" Jan 29 16:44:40 crc kubenswrapper[4886]: I0129 16:44:40.678693 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-76788598db-85zgx" Jan 29 16:44:40 crc kubenswrapper[4886]: I0129 16:44:40.788291 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-69d9546745-9q2lr" Jan 29 16:44:41 crc kubenswrapper[4886]: I0129 16:44:41.815890 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Jan 29 16:44:41 crc kubenswrapper[4886]: I0129 16:44:41.831080 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Jan 29 16:44:41 crc kubenswrapper[4886]: I0129 16:44:41.977530 4886 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Jan 29 16:44:41 crc kubenswrapper[4886]: I0129 16:44:41.977592 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="0dd1a523-96c1-4311-9452-92e6da8a7e9b" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 29 16:44:51 crc kubenswrapper[4886]: I0129 16:44:51.978526 4886 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Jan 29 16:44:51 crc kubenswrapper[4886]: I0129 16:44:51.979092 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="0dd1a523-96c1-4311-9452-92e6da8a7e9b" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 29 16:44:59 crc kubenswrapper[4886]: I0129 16:44:59.660697 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:44:59 crc kubenswrapper[4886]: I0129 16:44:59.661067 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:44:59 crc kubenswrapper[4886]: I0129 16:44:59.661119 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 16:44:59 crc kubenswrapper[4886]: I0129 16:44:59.661852 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e07342110c4b02787cb4723c63fa377397be4b574d1be34193ab1f7b4cebac54"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 16:44:59 crc kubenswrapper[4886]: I0129 16:44:59.661918 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://e07342110c4b02787cb4723c63fa377397be4b574d1be34193ab1f7b4cebac54" gracePeriod=600 Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.024191 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="e07342110c4b02787cb4723c63fa377397be4b574d1be34193ab1f7b4cebac54" exitCode=0 Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.024250 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"e07342110c4b02787cb4723c63fa377397be4b574d1be34193ab1f7b4cebac54"} Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.024296 4886 scope.go:117] "RemoveContainer" containerID="84a645b31233e6f6691e7af3a8d18c33f1db7629388f3007d7e51e43f9f65e97" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.157639 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr"] Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.160370 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.162904 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.163760 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.166849 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr"] Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.258213 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-config-volume\") pod \"collect-profiles-29495085-rzdqr\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.258268 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfdz7\" (UniqueName: \"kubernetes.io/projected/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-kube-api-access-dfdz7\") pod \"collect-profiles-29495085-rzdqr\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.258468 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-secret-volume\") pod \"collect-profiles-29495085-rzdqr\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.360105 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-config-volume\") pod \"collect-profiles-29495085-rzdqr\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.360151 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfdz7\" (UniqueName: \"kubernetes.io/projected/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-kube-api-access-dfdz7\") pod \"collect-profiles-29495085-rzdqr\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.360215 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-secret-volume\") pod \"collect-profiles-29495085-rzdqr\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.362133 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-config-volume\") pod 
\"collect-profiles-29495085-rzdqr\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.368097 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-secret-volume\") pod \"collect-profiles-29495085-rzdqr\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.387664 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfdz7\" (UniqueName: \"kubernetes.io/projected/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-kube-api-access-dfdz7\") pod \"collect-profiles-29495085-rzdqr\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.493907 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:00 crc kubenswrapper[4886]: I0129 16:45:00.955453 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr"] Jan 29 16:45:00 crc kubenswrapper[4886]: W0129 16:45:00.968795 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a04871a_41ba_40fc_bfb0_ca8f308e9b01.slice/crio-07fb4c9195f3111e975be0d4d67ac8f418ab546897a410bb0eb6ff30585cce6b WatchSource:0}: Error finding container 07fb4c9195f3111e975be0d4d67ac8f418ab546897a410bb0eb6ff30585cce6b: Status 404 returned error can't find the container with id 07fb4c9195f3111e975be0d4d67ac8f418ab546897a410bb0eb6ff30585cce6b Jan 29 16:45:01 crc kubenswrapper[4886]: I0129 16:45:01.037146 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463"} Jan 29 16:45:01 crc kubenswrapper[4886]: I0129 16:45:01.039123 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" event={"ID":"0a04871a-41ba-40fc-bfb0-ca8f308e9b01","Type":"ContainerStarted","Data":"07fb4c9195f3111e975be0d4d67ac8f418ab546897a410bb0eb6ff30585cce6b"} Jan 29 16:45:01 crc kubenswrapper[4886]: I0129 16:45:01.973257 4886 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Jan 29 16:45:01 crc kubenswrapper[4886]: I0129 16:45:01.973655 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="0dd1a523-96c1-4311-9452-92e6da8a7e9b" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 29 16:45:02 crc kubenswrapper[4886]: I0129 16:45:02.045755 4886 generic.go:334] "Generic (PLEG): container finished" podID="0a04871a-41ba-40fc-bfb0-ca8f308e9b01" containerID="11c1455f9476b08d8f802dd75f2ecc6d25f6377ab593571ce7bee30aa00fa339" exitCode=0 Jan 29 16:45:02 crc 
kubenswrapper[4886]: I0129 16:45:02.046281 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" event={"ID":"0a04871a-41ba-40fc-bfb0-ca8f308e9b01","Type":"ContainerDied","Data":"11c1455f9476b08d8f802dd75f2ecc6d25f6377ab593571ce7bee30aa00fa339"} Jan 29 16:45:03 crc kubenswrapper[4886]: I0129 16:45:03.334716 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:03 crc kubenswrapper[4886]: I0129 16:45:03.519559 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfdz7\" (UniqueName: \"kubernetes.io/projected/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-kube-api-access-dfdz7\") pod \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " Jan 29 16:45:03 crc kubenswrapper[4886]: I0129 16:45:03.519672 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-config-volume\") pod \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " Jan 29 16:45:03 crc kubenswrapper[4886]: I0129 16:45:03.519738 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-secret-volume\") pod \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\" (UID: \"0a04871a-41ba-40fc-bfb0-ca8f308e9b01\") " Jan 29 16:45:03 crc kubenswrapper[4886]: I0129 16:45:03.521501 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-config-volume" (OuterVolumeSpecName: "config-volume") pod "0a04871a-41ba-40fc-bfb0-ca8f308e9b01" (UID: "0a04871a-41ba-40fc-bfb0-ca8f308e9b01"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:45:03 crc kubenswrapper[4886]: I0129 16:45:03.524781 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-kube-api-access-dfdz7" (OuterVolumeSpecName: "kube-api-access-dfdz7") pod "0a04871a-41ba-40fc-bfb0-ca8f308e9b01" (UID: "0a04871a-41ba-40fc-bfb0-ca8f308e9b01"). InnerVolumeSpecName "kube-api-access-dfdz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:45:03 crc kubenswrapper[4886]: I0129 16:45:03.524925 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0a04871a-41ba-40fc-bfb0-ca8f308e9b01" (UID: "0a04871a-41ba-40fc-bfb0-ca8f308e9b01"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:45:03 crc kubenswrapper[4886]: I0129 16:45:03.621691 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfdz7\" (UniqueName: \"kubernetes.io/projected/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-kube-api-access-dfdz7\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:03 crc kubenswrapper[4886]: I0129 16:45:03.621743 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:03 crc kubenswrapper[4886]: I0129 16:45:03.621771 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a04871a-41ba-40fc-bfb0-ca8f308e9b01-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:04 crc kubenswrapper[4886]: I0129 16:45:04.059221 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" event={"ID":"0a04871a-41ba-40fc-bfb0-ca8f308e9b01","Type":"ContainerDied","Data":"07fb4c9195f3111e975be0d4d67ac8f418ab546897a410bb0eb6ff30585cce6b"} Jan 29 16:45:04 crc kubenswrapper[4886]: I0129 16:45:04.059257 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07fb4c9195f3111e975be0d4d67ac8f418ab546897a410bb0eb6ff30585cce6b" Jan 29 16:45:04 crc kubenswrapper[4886]: I0129 16:45:04.059258 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr" Jan 29 16:45:11 crc kubenswrapper[4886]: I0129 16:45:11.977927 4886 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Jan 29 16:45:11 crc kubenswrapper[4886]: I0129 16:45:11.978532 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="0dd1a523-96c1-4311-9452-92e6da8a7e9b" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 29 16:45:21 crc kubenswrapper[4886]: I0129 16:45:21.977096 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.224047 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-kp57g"] Jan 29 16:45:39 crc kubenswrapper[4886]: E0129 16:45:39.224867 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a04871a-41ba-40fc-bfb0-ca8f308e9b01" containerName="collect-profiles" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.224882 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a04871a-41ba-40fc-bfb0-ca8f308e9b01" containerName="collect-profiles" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.225032 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a04871a-41ba-40fc-bfb0-ca8f308e9b01" containerName="collect-profiles" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.225647 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.243583 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.244920 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.245233 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.246465 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.246685 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-vk7pr" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.260367 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.260409 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-kp57g"] Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.308179 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-kp57g"] Jan 29 16:45:39 crc kubenswrapper[4886]: E0129 16:45:39.309126 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-7ndgz metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-kp57g" podUID="0fdf3fef-2955-4239-bac3-5fa54858ca90" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.357564 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config-openshift-service-cacrt\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.357733 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-sa-token\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.357826 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fdf3fef-2955-4239-bac3-5fa54858ca90-tmp\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.357921 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-metrics\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.357957 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/0fdf3fef-2955-4239-bac3-5fa54858ca90-datadir\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.358003 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.358106 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ndgz\" (UniqueName: \"kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-kube-api-access-7ndgz\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.358173 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-syslog-receiver\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.359425 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-entrypoint\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.359476 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-token\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.360051 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-trusted-ca\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.368053 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.379762 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.463802 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config-openshift-service-cacrt\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.463845 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-sa-token\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.463884 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fdf3fef-2955-4239-bac3-5fa54858ca90-tmp\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.463925 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-metrics\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.463944 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/0fdf3fef-2955-4239-bac3-5fa54858ca90-datadir\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.463966 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.463981 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ndgz\" (UniqueName: \"kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-kube-api-access-7ndgz\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.463996 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-syslog-receiver\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.464009 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-entrypoint\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.464025 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" 
(UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-token\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.464059 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-trusted-ca\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.464504 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config-openshift-service-cacrt\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: E0129 16:45:39.464736 4886 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Jan 29 16:45:39 crc kubenswrapper[4886]: E0129 16:45:39.464854 4886 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Jan 29 16:45:39 crc kubenswrapper[4886]: E0129 16:45:39.464918 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-metrics podName:0fdf3fef-2955-4239-bac3-5fa54858ca90 nodeName:}" failed. No retries permitted until 2026-01-29 16:45:39.964892587 +0000 UTC m=+1422.873611859 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-metrics") pod "collector-kp57g" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90") : secret "collector-metrics" not found Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.464957 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/0fdf3fef-2955-4239-bac3-5fa54858ca90-datadir\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.465024 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: E0129 16:45:39.465198 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-syslog-receiver podName:0fdf3fef-2955-4239-bac3-5fa54858ca90 nodeName:}" failed. No retries permitted until 2026-01-29 16:45:39.965181225 +0000 UTC m=+1422.873900517 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-syslog-receiver") pod "collector-kp57g" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90") : secret "collector-syslog-receiver" not found Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.465369 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-entrypoint\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.465761 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-trusted-ca\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.476873 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-token\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.477912 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fdf3fef-2955-4239-bac3-5fa54858ca90-tmp\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.482909 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ndgz\" (UniqueName: \"kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-kube-api-access-7ndgz\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.487535 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-sa-token\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.564800 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config-openshift-service-cacrt\") pod \"0fdf3fef-2955-4239-bac3-5fa54858ca90\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.565104 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config\") pod \"0fdf3fef-2955-4239-bac3-5fa54858ca90\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.565229 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-trusted-ca\") pod \"0fdf3fef-2955-4239-bac3-5fa54858ca90\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 
16:45:39.565351 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-token\") pod \"0fdf3fef-2955-4239-bac3-5fa54858ca90\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.565481 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-entrypoint\") pod \"0fdf3fef-2955-4239-bac3-5fa54858ca90\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.565592 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/0fdf3fef-2955-4239-bac3-5fa54858ca90-datadir\") pod \"0fdf3fef-2955-4239-bac3-5fa54858ca90\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.565567 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "0fdf3fef-2955-4239-bac3-5fa54858ca90" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.565649 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0fdf3fef-2955-4239-bac3-5fa54858ca90" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.565958 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "0fdf3fef-2955-4239-bac3-5fa54858ca90" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.565985 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fdf3fef-2955-4239-bac3-5fa54858ca90-datadir" (OuterVolumeSpecName: "datadir") pod "0fdf3fef-2955-4239-bac3-5fa54858ca90" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.566533 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config" (OuterVolumeSpecName: "config") pod "0fdf3fef-2955-4239-bac3-5fa54858ca90" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.568770 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-token" (OuterVolumeSpecName: "collector-token") pod "0fdf3fef-2955-4239-bac3-5fa54858ca90" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90"). 
InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.667834 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ndgz\" (UniqueName: \"kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-kube-api-access-7ndgz\") pod \"0fdf3fef-2955-4239-bac3-5fa54858ca90\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.667886 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-sa-token\") pod \"0fdf3fef-2955-4239-bac3-5fa54858ca90\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.668014 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fdf3fef-2955-4239-bac3-5fa54858ca90-tmp\") pod \"0fdf3fef-2955-4239-bac3-5fa54858ca90\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.668429 4886 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.668444 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-config\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.668454 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.668463 4886 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-token\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.668473 4886 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/0fdf3fef-2955-4239-bac3-5fa54858ca90-entrypoint\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.668480 4886 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/0fdf3fef-2955-4239-bac3-5fa54858ca90-datadir\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.671190 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-kube-api-access-7ndgz" (OuterVolumeSpecName: "kube-api-access-7ndgz") pod "0fdf3fef-2955-4239-bac3-5fa54858ca90" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90"). InnerVolumeSpecName "kube-api-access-7ndgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.671246 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-sa-token" (OuterVolumeSpecName: "sa-token") pod "0fdf3fef-2955-4239-bac3-5fa54858ca90" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90"). 
InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.671355 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fdf3fef-2955-4239-bac3-5fa54858ca90-tmp" (OuterVolumeSpecName: "tmp") pod "0fdf3fef-2955-4239-bac3-5fa54858ca90" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.770062 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ndgz\" (UniqueName: \"kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-kube-api-access-7ndgz\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.770090 4886 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/0fdf3fef-2955-4239-bac3-5fa54858ca90-sa-token\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.770099 4886 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fdf3fef-2955-4239-bac3-5fa54858ca90-tmp\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.972643 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-syslog-receiver\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.972774 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-metrics\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.976020 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-syslog-receiver\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:39 crc kubenswrapper[4886]: I0129 16:45:39.977617 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-metrics\") pod \"collector-kp57g\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " pod="openshift-logging/collector-kp57g" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.073771 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-metrics\") pod \"0fdf3fef-2955-4239-bac3-5fa54858ca90\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.073992 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-syslog-receiver\") pod \"0fdf3fef-2955-4239-bac3-5fa54858ca90\" (UID: \"0fdf3fef-2955-4239-bac3-5fa54858ca90\") " Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.077611 4886 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "0fdf3fef-2955-4239-bac3-5fa54858ca90" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.078465 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-metrics" (OuterVolumeSpecName: "metrics") pod "0fdf3fef-2955-4239-bac3-5fa54858ca90" (UID: "0fdf3fef-2955-4239-bac3-5fa54858ca90"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.176568 4886 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-metrics\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.176917 4886 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0fdf3fef-2955-4239-bac3-5fa54858ca90-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.375771 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-kp57g" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.445952 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-kp57g"] Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.452936 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-kp57g"] Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.467056 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-qnmmn"] Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.469020 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.472092 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.472510 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.472859 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.473361 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.473760 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-vk7pr" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.478137 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-qnmmn"] Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.480416 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.585932 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqsbr\" (UniqueName: \"kubernetes.io/projected/bd8dc819-215b-44f5-b758-9bac32be60f5-kube-api-access-vqsbr\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.586035 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-trusted-ca\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.586119 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/bd8dc819-215b-44f5-b758-9bac32be60f5-datadir\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.586140 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/bd8dc819-215b-44f5-b758-9bac32be60f5-sa-token\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.586160 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/bd8dc819-215b-44f5-b758-9bac32be60f5-collector-syslog-receiver\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.586497 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/bd8dc819-215b-44f5-b758-9bac32be60f5-collector-token\") pod \"collector-qnmmn\" (UID: 
\"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.586828 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/bd8dc819-215b-44f5-b758-9bac32be60f5-metrics\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.586908 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-config\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.586982 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-entrypoint\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.587134 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-config-openshift-service-cacrt\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.587293 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd8dc819-215b-44f5-b758-9bac32be60f5-tmp\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.624631 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fdf3fef-2955-4239-bac3-5fa54858ca90" path="/var/lib/kubelet/pods/0fdf3fef-2955-4239-bac3-5fa54858ca90/volumes" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.689779 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/bd8dc819-215b-44f5-b758-9bac32be60f5-datadir\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.689866 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/bd8dc819-215b-44f5-b758-9bac32be60f5-datadir\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.689868 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/bd8dc819-215b-44f5-b758-9bac32be60f5-sa-token\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.689980 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: 
\"kubernetes.io/secret/bd8dc819-215b-44f5-b758-9bac32be60f5-collector-syslog-receiver\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.690084 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/bd8dc819-215b-44f5-b758-9bac32be60f5-collector-token\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.690218 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/bd8dc819-215b-44f5-b758-9bac32be60f5-metrics\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.690280 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-config\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.690350 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-entrypoint\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.690416 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-config-openshift-service-cacrt\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.690483 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd8dc819-215b-44f5-b758-9bac32be60f5-tmp\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.690718 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqsbr\" (UniqueName: \"kubernetes.io/projected/bd8dc819-215b-44f5-b758-9bac32be60f5-kube-api-access-vqsbr\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.690762 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-trusted-ca\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.691745 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-config\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.691755 
4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-config-openshift-service-cacrt\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.691975 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-trusted-ca\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.694544 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/bd8dc819-215b-44f5-b758-9bac32be60f5-collector-token\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.695861 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd8dc819-215b-44f5-b758-9bac32be60f5-tmp\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.697006 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/bd8dc819-215b-44f5-b758-9bac32be60f5-collector-syslog-receiver\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.698288 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/bd8dc819-215b-44f5-b758-9bac32be60f5-metrics\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.714810 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/bd8dc819-215b-44f5-b758-9bac32be60f5-entrypoint\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.719099 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqsbr\" (UniqueName: \"kubernetes.io/projected/bd8dc819-215b-44f5-b758-9bac32be60f5-kube-api-access-vqsbr\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.723577 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/bd8dc819-215b-44f5-b758-9bac32be60f5-sa-token\") pod \"collector-qnmmn\" (UID: \"bd8dc819-215b-44f5-b758-9bac32be60f5\") " pod="openshift-logging/collector-qnmmn" Jan 29 16:45:40 crc kubenswrapper[4886]: I0129 16:45:40.826394 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-qnmmn" Jan 29 16:45:41 crc kubenswrapper[4886]: I0129 16:45:41.270773 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-qnmmn"] Jan 29 16:45:41 crc kubenswrapper[4886]: I0129 16:45:41.387822 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-qnmmn" event={"ID":"bd8dc819-215b-44f5-b758-9bac32be60f5","Type":"ContainerStarted","Data":"cb9480145b48c1c160d565f2702f69ad12d158e1ef85b91a82e365f071052f0f"} Jan 29 16:45:50 crc kubenswrapper[4886]: I0129 16:45:50.465176 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-qnmmn" event={"ID":"bd8dc819-215b-44f5-b758-9bac32be60f5","Type":"ContainerStarted","Data":"00f68f7f911c02ad1310aafa23adbce23e7c17489ab5225b4c7ab5fedca83995"} Jan 29 16:45:50 crc kubenswrapper[4886]: I0129 16:45:50.506890 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-qnmmn" podStartSLOduration=2.511114523 podStartE2EDuration="10.506859075s" podCreationTimestamp="2026-01-29 16:45:40 +0000 UTC" firstStartedPulling="2026-01-29 16:45:41.281141327 +0000 UTC m=+1424.189860609" lastFinishedPulling="2026-01-29 16:45:49.276885889 +0000 UTC m=+1432.185605161" observedRunningTime="2026-01-29 16:45:50.497789052 +0000 UTC m=+1433.406508374" watchObservedRunningTime="2026-01-29 16:45:50.506859075 +0000 UTC m=+1433.415578377" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.116308 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s4tkp"] Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.120132 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.129413 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4tkp"] Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.237501 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvhpt\" (UniqueName: \"kubernetes.io/projected/70fc38f3-74c0-462d-9ad2-60f109b2d365-kube-api-access-bvhpt\") pod \"redhat-marketplace-s4tkp\" (UID: \"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.237577 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-catalog-content\") pod \"redhat-marketplace-s4tkp\" (UID: \"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.237648 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-utilities\") pod \"redhat-marketplace-s4tkp\" (UID: \"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.339915 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-utilities\") pod \"redhat-marketplace-s4tkp\" (UID: 
\"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.340370 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvhpt\" (UniqueName: \"kubernetes.io/projected/70fc38f3-74c0-462d-9ad2-60f109b2d365-kube-api-access-bvhpt\") pod \"redhat-marketplace-s4tkp\" (UID: \"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.340521 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-utilities\") pod \"redhat-marketplace-s4tkp\" (UID: \"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.340677 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-catalog-content\") pod \"redhat-marketplace-s4tkp\" (UID: \"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.340909 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-catalog-content\") pod \"redhat-marketplace-s4tkp\" (UID: \"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.365635 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvhpt\" (UniqueName: \"kubernetes.io/projected/70fc38f3-74c0-462d-9ad2-60f109b2d365-kube-api-access-bvhpt\") pod \"redhat-marketplace-s4tkp\" (UID: \"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.462801 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:16 crc kubenswrapper[4886]: I0129 16:46:16.915596 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4tkp"] Jan 29 16:46:17 crc kubenswrapper[4886]: I0129 16:46:17.738820 4886 generic.go:334] "Generic (PLEG): container finished" podID="70fc38f3-74c0-462d-9ad2-60f109b2d365" containerID="a6ec04dedfc222e2930d911f7475d986731b7050751d92e32b232da84ad7a329" exitCode=0 Jan 29 16:46:17 crc kubenswrapper[4886]: I0129 16:46:17.738945 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4tkp" event={"ID":"70fc38f3-74c0-462d-9ad2-60f109b2d365","Type":"ContainerDied","Data":"a6ec04dedfc222e2930d911f7475d986731b7050751d92e32b232da84ad7a329"} Jan 29 16:46:17 crc kubenswrapper[4886]: I0129 16:46:17.739050 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4tkp" event={"ID":"70fc38f3-74c0-462d-9ad2-60f109b2d365","Type":"ContainerStarted","Data":"fc5358167411608003143a7e9911eec6e0a3a3cefade8c9902a65d696f96288f"} Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.570949 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.580490 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2wln4n"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.589901 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.598182 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bn8v4t"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.607145 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.643206 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b00b2947-6947-4d0a-b2d9-42adefd8ebb3" path="/var/lib/kubelet/pods/b00b2947-6947-4d0a-b2d9-42adefd8ebb3/volumes" Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.644241 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6c5874b-97c3-4f3e-8e88-68c3653a6c4a" path="/var/lib/kubelet/pods/e6c5874b-97c3-4f3e-8e88-68c3653a6c4a/volumes" Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.644829 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08c2snz"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.644859 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfv6k"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.644879 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q5hs7"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.644889 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtk7r"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.645060 4886 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" podUID="42b8dc70-b29d-4995-9727-9b8e032bdad9" containerName="marketplace-operator" containerID="cri-o://f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e" gracePeriod=30 Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.645254 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jfv6k" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" containerName="registry-server" containerID="cri-o://735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef" gracePeriod=30 Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.645717 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q5hs7" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" containerName="registry-server" containerID="cri-o://efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e" gracePeriod=30 Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.655027 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qbl4"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.656455 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4qbl4" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" containerName="registry-server" containerID="cri-o://26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d" gracePeriod=30 Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.664571 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4tkp"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.673498 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m8snn"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.675118 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.680168 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkk68"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.680477 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zkk68" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerName="registry-server" containerID="cri-o://29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990" gracePeriod=30 Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.686795 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m8snn"] Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.689006 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9cb13d4a-3940-45ef-9135-ff94c6a75b0c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m8snn\" (UID: \"9cb13d4a-3940-45ef-9135-ff94c6a75b0c\") " pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.689103 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9cb13d4a-3940-45ef-9135-ff94c6a75b0c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m8snn\" (UID: \"9cb13d4a-3940-45ef-9135-ff94c6a75b0c\") " pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.689128 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tz68\" (UniqueName: \"kubernetes.io/projected/9cb13d4a-3940-45ef-9135-ff94c6a75b0c-kube-api-access-6tz68\") pod \"marketplace-operator-79b997595-m8snn\" (UID: \"9cb13d4a-3940-45ef-9135-ff94c6a75b0c\") " pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.746450 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4tkp" event={"ID":"70fc38f3-74c0-462d-9ad2-60f109b2d365","Type":"ContainerStarted","Data":"cd0174e3243b8d22b133a543427ce03858c997e6e589bac4aa5cc61f6f83f38c"} Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.790159 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9cb13d4a-3940-45ef-9135-ff94c6a75b0c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m8snn\" (UID: \"9cb13d4a-3940-45ef-9135-ff94c6a75b0c\") " pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.790223 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tz68\" (UniqueName: \"kubernetes.io/projected/9cb13d4a-3940-45ef-9135-ff94c6a75b0c-kube-api-access-6tz68\") pod \"marketplace-operator-79b997595-m8snn\" (UID: \"9cb13d4a-3940-45ef-9135-ff94c6a75b0c\") " pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.790310 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/9cb13d4a-3940-45ef-9135-ff94c6a75b0c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m8snn\" (UID: \"9cb13d4a-3940-45ef-9135-ff94c6a75b0c\") " pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.791981 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9cb13d4a-3940-45ef-9135-ff94c6a75b0c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m8snn\" (UID: \"9cb13d4a-3940-45ef-9135-ff94c6a75b0c\") " pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.801749 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9cb13d4a-3940-45ef-9135-ff94c6a75b0c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m8snn\" (UID: \"9cb13d4a-3940-45ef-9135-ff94c6a75b0c\") " pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:18 crc kubenswrapper[4886]: I0129 16:46:18.811392 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tz68\" (UniqueName: \"kubernetes.io/projected/9cb13d4a-3940-45ef-9135-ff94c6a75b0c-kube-api-access-6tz68\") pod \"marketplace-operator-79b997595-m8snn\" (UID: \"9cb13d4a-3940-45ef-9135-ff94c6a75b0c\") " pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.196307 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.197602 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.209152 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.212221 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.234785 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.248486 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402122 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-trusted-ca\") pod \"42b8dc70-b29d-4995-9727-9b8e032bdad9\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402435 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-operator-metrics\") pod \"42b8dc70-b29d-4995-9727-9b8e032bdad9\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402463 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-catalog-content\") pod \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402502 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-catalog-content\") pod \"69003a39-1c09-4087-a494-ebfd69e973cf\" (UID: \"69003a39-1c09-4087-a494-ebfd69e973cf\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402521 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-utilities\") pod \"a7325ad0-28bf-45e0-bbd5-160f441de091\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402564 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mlnk\" (UniqueName: \"kubernetes.io/projected/69003a39-1c09-4087-a494-ebfd69e973cf-kube-api-access-5mlnk\") pod \"69003a39-1c09-4087-a494-ebfd69e973cf\" (UID: \"69003a39-1c09-4087-a494-ebfd69e973cf\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402615 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-catalog-content\") pod \"a7325ad0-28bf-45e0-bbd5-160f441de091\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402637 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-utilities\") pod \"69003a39-1c09-4087-a494-ebfd69e973cf\" (UID: \"69003a39-1c09-4087-a494-ebfd69e973cf\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402921 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn92n\" (UniqueName: \"kubernetes.io/projected/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-kube-api-access-vn92n\") pod \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402943 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-utilities\") pod \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402971 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzm6k\" (UniqueName: \"kubernetes.io/projected/42b8dc70-b29d-4995-9727-9b8e032bdad9-kube-api-access-pzm6k\") pod \"42b8dc70-b29d-4995-9727-9b8e032bdad9\" (UID: \"42b8dc70-b29d-4995-9727-9b8e032bdad9\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402987 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-catalog-content\") pod \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.403120 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-utilities\") pod \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\" (UID: \"d84ce3e9-c41a-4a08-8d86-2a918d5e9450\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.403147 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8jsj\" (UniqueName: \"kubernetes.io/projected/a7325ad0-28bf-45e0-bbd5-160f441de091-kube-api-access-c8jsj\") pod \"a7325ad0-28bf-45e0-bbd5-160f441de091\" (UID: \"a7325ad0-28bf-45e0-bbd5-160f441de091\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.403170 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf7sq\" (UniqueName: \"kubernetes.io/projected/57aa9115-b2d5-45aa-8ac3-e251c0907e45-kube-api-access-vf7sq\") pod \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\" (UID: \"57aa9115-b2d5-45aa-8ac3-e251c0907e45\") " Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.402707 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "42b8dc70-b29d-4995-9727-9b8e032bdad9" (UID: "42b8dc70-b29d-4995-9727-9b8e032bdad9"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.403418 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-utilities" (OuterVolumeSpecName: "utilities") pod "a7325ad0-28bf-45e0-bbd5-160f441de091" (UID: "a7325ad0-28bf-45e0-bbd5-160f441de091"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.406209 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "42b8dc70-b29d-4995-9727-9b8e032bdad9" (UID: "42b8dc70-b29d-4995-9727-9b8e032bdad9"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.407130 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-utilities" (OuterVolumeSpecName: "utilities") pod "d84ce3e9-c41a-4a08-8d86-2a918d5e9450" (UID: "d84ce3e9-c41a-4a08-8d86-2a918d5e9450"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.410042 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7325ad0-28bf-45e0-bbd5-160f441de091-kube-api-access-c8jsj" (OuterVolumeSpecName: "kube-api-access-c8jsj") pod "a7325ad0-28bf-45e0-bbd5-160f441de091" (UID: "a7325ad0-28bf-45e0-bbd5-160f441de091"). InnerVolumeSpecName "kube-api-access-c8jsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.410110 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57aa9115-b2d5-45aa-8ac3-e251c0907e45-kube-api-access-vf7sq" (OuterVolumeSpecName: "kube-api-access-vf7sq") pod "57aa9115-b2d5-45aa-8ac3-e251c0907e45" (UID: "57aa9115-b2d5-45aa-8ac3-e251c0907e45"). InnerVolumeSpecName "kube-api-access-vf7sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.410795 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-utilities" (OuterVolumeSpecName: "utilities") pod "69003a39-1c09-4087-a494-ebfd69e973cf" (UID: "69003a39-1c09-4087-a494-ebfd69e973cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.412082 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-utilities" (OuterVolumeSpecName: "utilities") pod "57aa9115-b2d5-45aa-8ac3-e251c0907e45" (UID: "57aa9115-b2d5-45aa-8ac3-e251c0907e45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.413945 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-kube-api-access-vn92n" (OuterVolumeSpecName: "kube-api-access-vn92n") pod "d84ce3e9-c41a-4a08-8d86-2a918d5e9450" (UID: "d84ce3e9-c41a-4a08-8d86-2a918d5e9450"). InnerVolumeSpecName "kube-api-access-vn92n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.414651 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b8dc70-b29d-4995-9727-9b8e032bdad9-kube-api-access-pzm6k" (OuterVolumeSpecName: "kube-api-access-pzm6k") pod "42b8dc70-b29d-4995-9727-9b8e032bdad9" (UID: "42b8dc70-b29d-4995-9727-9b8e032bdad9"). InnerVolumeSpecName "kube-api-access-pzm6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.420814 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69003a39-1c09-4087-a494-ebfd69e973cf-kube-api-access-5mlnk" (OuterVolumeSpecName: "kube-api-access-5mlnk") pod "69003a39-1c09-4087-a494-ebfd69e973cf" (UID: "69003a39-1c09-4087-a494-ebfd69e973cf"). 
InnerVolumeSpecName "kube-api-access-5mlnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.439576 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57aa9115-b2d5-45aa-8ac3-e251c0907e45" (UID: "57aa9115-b2d5-45aa-8ac3-e251c0907e45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.459026 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69003a39-1c09-4087-a494-ebfd69e973cf" (UID: "69003a39-1c09-4087-a494-ebfd69e973cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.461427 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7325ad0-28bf-45e0-bbd5-160f441de091" (UID: "a7325ad0-28bf-45e0-bbd5-160f441de091"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504634 4886 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504680 4886 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/42b8dc70-b29d-4995-9727-9b8e032bdad9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504698 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504711 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504725 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mlnk\" (UniqueName: \"kubernetes.io/projected/69003a39-1c09-4087-a494-ebfd69e973cf-kube-api-access-5mlnk\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504737 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7325ad0-28bf-45e0-bbd5-160f441de091-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504748 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69003a39-1c09-4087-a494-ebfd69e973cf-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504760 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn92n\" (UniqueName: 
\"kubernetes.io/projected/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-kube-api-access-vn92n\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504771 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504782 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzm6k\" (UniqueName: \"kubernetes.io/projected/42b8dc70-b29d-4995-9727-9b8e032bdad9-kube-api-access-pzm6k\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504794 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57aa9115-b2d5-45aa-8ac3-e251c0907e45-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504806 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504817 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8jsj\" (UniqueName: \"kubernetes.io/projected/a7325ad0-28bf-45e0-bbd5-160f441de091-kube-api-access-c8jsj\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.504829 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf7sq\" (UniqueName: \"kubernetes.io/projected/57aa9115-b2d5-45aa-8ac3-e251c0907e45-kube-api-access-vf7sq\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.521773 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d84ce3e9-c41a-4a08-8d86-2a918d5e9450" (UID: "d84ce3e9-c41a-4a08-8d86-2a918d5e9450"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.605737 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d84ce3e9-c41a-4a08-8d86-2a918d5e9450-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.697647 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m8snn"] Jan 29 16:46:19 crc kubenswrapper[4886]: W0129 16:46:19.700737 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cb13d4a_3940_45ef_9135_ff94c6a75b0c.slice/crio-7413b62657ae27eb3cf801eb842106f18c56c183ec06f3f9275517ece6cc636b WatchSource:0}: Error finding container 7413b62657ae27eb3cf801eb842106f18c56c183ec06f3f9275517ece6cc636b: Status 404 returned error can't find the container with id 7413b62657ae27eb3cf801eb842106f18c56c183ec06f3f9275517ece6cc636b Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.755238 4886 generic.go:334] "Generic (PLEG): container finished" podID="70fc38f3-74c0-462d-9ad2-60f109b2d365" containerID="cd0174e3243b8d22b133a543427ce03858c997e6e589bac4aa5cc61f6f83f38c" exitCode=0 Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.755314 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4tkp" event={"ID":"70fc38f3-74c0-462d-9ad2-60f109b2d365","Type":"ContainerDied","Data":"cd0174e3243b8d22b133a543427ce03858c997e6e589bac4aa5cc61f6f83f38c"} Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.758479 4886 generic.go:334] "Generic (PLEG): container finished" podID="42b8dc70-b29d-4995-9727-9b8e032bdad9" containerID="f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e" exitCode=0 Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.758595 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" event={"ID":"42b8dc70-b29d-4995-9727-9b8e032bdad9","Type":"ContainerDied","Data":"f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e"} Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.758621 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" event={"ID":"42b8dc70-b29d-4995-9727-9b8e032bdad9","Type":"ContainerDied","Data":"648bc592f49ae3cedaf90d37922cbc1e1495121ad8e957f81f4908846b5e05da"} Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.758623 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtk7r" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.758648 4886 scope.go:117] "RemoveContainer" containerID="f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.768933 4886 generic.go:334] "Generic (PLEG): container finished" podID="69003a39-1c09-4087-a494-ebfd69e973cf" containerID="735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef" exitCode=0 Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.769022 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jfv6k" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.768983 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfv6k" event={"ID":"69003a39-1c09-4087-a494-ebfd69e973cf","Type":"ContainerDied","Data":"735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef"} Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.769095 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfv6k" event={"ID":"69003a39-1c09-4087-a494-ebfd69e973cf","Type":"ContainerDied","Data":"e4d88167fe4815cd042b435714fee0326b8557c7e5fb2b46e9557a042ac995f8"} Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.784539 4886 generic.go:334] "Generic (PLEG): container finished" podID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerID="29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990" exitCode=0 Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.784664 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkk68" event={"ID":"d84ce3e9-c41a-4a08-8d86-2a918d5e9450","Type":"ContainerDied","Data":"29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990"} Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.784709 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkk68" event={"ID":"d84ce3e9-c41a-4a08-8d86-2a918d5e9450","Type":"ContainerDied","Data":"1de9e48715ad861e4d8bd78cecc12c2dcf52cdf92d4274338ddeebf931d7420d"} Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.784861 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkk68" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.794009 4886 scope.go:117] "RemoveContainer" containerID="f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.794444 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" event={"ID":"9cb13d4a-3940-45ef-9135-ff94c6a75b0c","Type":"ContainerStarted","Data":"7413b62657ae27eb3cf801eb842106f18c56c183ec06f3f9275517ece6cc636b"} Jan 29 16:46:19 crc kubenswrapper[4886]: E0129 16:46:19.794787 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e\": container with ID starting with f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e not found: ID does not exist" containerID="f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.794936 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e"} err="failed to get container status \"f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e\": rpc error: code = NotFound desc = could not find container \"f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e\": container with ID starting with f67a42038126009d6221ae06e997c4b3a4d04b56f64c29fbc910653a5611145e not found: ID does not exist" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.794970 4886 scope.go:117] "RemoveContainer" 
containerID="735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.800895 4886 generic.go:334] "Generic (PLEG): container finished" podID="a7325ad0-28bf-45e0-bbd5-160f441de091" containerID="efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e" exitCode=0 Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.800989 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5hs7" event={"ID":"a7325ad0-28bf-45e0-bbd5-160f441de091","Type":"ContainerDied","Data":"efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e"} Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.801016 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q5hs7" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.801049 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5hs7" event={"ID":"a7325ad0-28bf-45e0-bbd5-160f441de091","Type":"ContainerDied","Data":"58e358a0eb4540bb049b243d60b0ba858eec19efdffef34538e1bbcdff0edbc6"} Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.805565 4886 generic.go:334] "Generic (PLEG): container finished" podID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" containerID="26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d" exitCode=0 Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.805659 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qbl4" event={"ID":"57aa9115-b2d5-45aa-8ac3-e251c0907e45","Type":"ContainerDied","Data":"26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d"} Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.805716 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qbl4" event={"ID":"57aa9115-b2d5-45aa-8ac3-e251c0907e45","Type":"ContainerDied","Data":"68d81ee76eccd615ba9046c4c1e6648df9ef22ce6eee6d566d9309dd619e6010"} Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.805654 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qbl4" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.825276 4886 scope.go:117] "RemoveContainer" containerID="9bd48ab4996ca74fa989778e83dba86fbb2f2ad2104534befcf501673ddd232f" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.836438 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtk7r"] Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.841293 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtk7r"] Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.895265 4886 scope.go:117] "RemoveContainer" containerID="9dc94c69454cda473e048b5be83a123e92e3d4dcc0206e5c91ebde5e727d2647" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.911939 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q5hs7"] Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.921943 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q5hs7"] Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.933446 4886 scope.go:117] "RemoveContainer" containerID="735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.934292 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfv6k"] Jan 29 16:46:19 crc kubenswrapper[4886]: E0129 16:46:19.938536 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef\": container with ID starting with 735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef not found: ID does not exist" containerID="735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.938569 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef"} err="failed to get container status \"735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef\": rpc error: code = NotFound desc = could not find container \"735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef\": container with ID starting with 735ad1f3c641d99dc2e721ad33c111100670ea307d45a8bb7eba837fe9c269ef not found: ID does not exist" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.938592 4886 scope.go:117] "RemoveContainer" containerID="9bd48ab4996ca74fa989778e83dba86fbb2f2ad2104534befcf501673ddd232f" Jan 29 16:46:19 crc kubenswrapper[4886]: E0129 16:46:19.939856 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd48ab4996ca74fa989778e83dba86fbb2f2ad2104534befcf501673ddd232f\": container with ID starting with 9bd48ab4996ca74fa989778e83dba86fbb2f2ad2104534befcf501673ddd232f not found: ID does not exist" containerID="9bd48ab4996ca74fa989778e83dba86fbb2f2ad2104534befcf501673ddd232f" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.945434 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jfv6k"] Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.948242 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9bd48ab4996ca74fa989778e83dba86fbb2f2ad2104534befcf501673ddd232f"} err="failed to get container status \"9bd48ab4996ca74fa989778e83dba86fbb2f2ad2104534befcf501673ddd232f\": rpc error: code = NotFound desc = could not find container \"9bd48ab4996ca74fa989778e83dba86fbb2f2ad2104534befcf501673ddd232f\": container with ID starting with 9bd48ab4996ca74fa989778e83dba86fbb2f2ad2104534befcf501673ddd232f not found: ID does not exist" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.948321 4886 scope.go:117] "RemoveContainer" containerID="9dc94c69454cda473e048b5be83a123e92e3d4dcc0206e5c91ebde5e727d2647" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.948496 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkk68"] Jan 29 16:46:19 crc kubenswrapper[4886]: E0129 16:46:19.949051 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc94c69454cda473e048b5be83a123e92e3d4dcc0206e5c91ebde5e727d2647\": container with ID starting with 9dc94c69454cda473e048b5be83a123e92e3d4dcc0206e5c91ebde5e727d2647 not found: ID does not exist" containerID="9dc94c69454cda473e048b5be83a123e92e3d4dcc0206e5c91ebde5e727d2647" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.949082 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc94c69454cda473e048b5be83a123e92e3d4dcc0206e5c91ebde5e727d2647"} err="failed to get container status \"9dc94c69454cda473e048b5be83a123e92e3d4dcc0206e5c91ebde5e727d2647\": rpc error: code = NotFound desc = could not find container \"9dc94c69454cda473e048b5be83a123e92e3d4dcc0206e5c91ebde5e727d2647\": container with ID starting with 9dc94c69454cda473e048b5be83a123e92e3d4dcc0206e5c91ebde5e727d2647 not found: ID does not exist" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.949101 4886 scope.go:117] "RemoveContainer" containerID="29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.952947 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zkk68"] Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.958013 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qbl4"] Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.961573 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qbl4"] Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.964169 4886 scope.go:117] "RemoveContainer" containerID="0fa864e4732d0bb9a1a68d7843a62bc56027d9ccdfea2ad23148f5d87b7ecd0c" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.980341 4886 scope.go:117] "RemoveContainer" containerID="9771013e1661afa4b7f2a5038c24d8397533ccd7c529146bb8fb2adf4c78bad6" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.994424 4886 scope.go:117] "RemoveContainer" containerID="29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990" Jan 29 16:46:19 crc kubenswrapper[4886]: E0129 16:46:19.994996 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990\": container with ID starting with 29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990 not found: ID does not exist" 
containerID="29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.995113 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990"} err="failed to get container status \"29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990\": rpc error: code = NotFound desc = could not find container \"29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990\": container with ID starting with 29f7d7e31f9e12ad7f76231137a2e9a61ff5af739a92e0ab7f9fef0c87106990 not found: ID does not exist" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.995155 4886 scope.go:117] "RemoveContainer" containerID="0fa864e4732d0bb9a1a68d7843a62bc56027d9ccdfea2ad23148f5d87b7ecd0c" Jan 29 16:46:19 crc kubenswrapper[4886]: E0129 16:46:19.995483 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fa864e4732d0bb9a1a68d7843a62bc56027d9ccdfea2ad23148f5d87b7ecd0c\": container with ID starting with 0fa864e4732d0bb9a1a68d7843a62bc56027d9ccdfea2ad23148f5d87b7ecd0c not found: ID does not exist" containerID="0fa864e4732d0bb9a1a68d7843a62bc56027d9ccdfea2ad23148f5d87b7ecd0c" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.995509 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa864e4732d0bb9a1a68d7843a62bc56027d9ccdfea2ad23148f5d87b7ecd0c"} err="failed to get container status \"0fa864e4732d0bb9a1a68d7843a62bc56027d9ccdfea2ad23148f5d87b7ecd0c\": rpc error: code = NotFound desc = could not find container \"0fa864e4732d0bb9a1a68d7843a62bc56027d9ccdfea2ad23148f5d87b7ecd0c\": container with ID starting with 0fa864e4732d0bb9a1a68d7843a62bc56027d9ccdfea2ad23148f5d87b7ecd0c not found: ID does not exist" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.995529 4886 scope.go:117] "RemoveContainer" containerID="9771013e1661afa4b7f2a5038c24d8397533ccd7c529146bb8fb2adf4c78bad6" Jan 29 16:46:19 crc kubenswrapper[4886]: E0129 16:46:19.995732 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9771013e1661afa4b7f2a5038c24d8397533ccd7c529146bb8fb2adf4c78bad6\": container with ID starting with 9771013e1661afa4b7f2a5038c24d8397533ccd7c529146bb8fb2adf4c78bad6 not found: ID does not exist" containerID="9771013e1661afa4b7f2a5038c24d8397533ccd7c529146bb8fb2adf4c78bad6" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.995753 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9771013e1661afa4b7f2a5038c24d8397533ccd7c529146bb8fb2adf4c78bad6"} err="failed to get container status \"9771013e1661afa4b7f2a5038c24d8397533ccd7c529146bb8fb2adf4c78bad6\": rpc error: code = NotFound desc = could not find container \"9771013e1661afa4b7f2a5038c24d8397533ccd7c529146bb8fb2adf4c78bad6\": container with ID starting with 9771013e1661afa4b7f2a5038c24d8397533ccd7c529146bb8fb2adf4c78bad6 not found: ID does not exist" Jan 29 16:46:19 crc kubenswrapper[4886]: I0129 16:46:19.995766 4886 scope.go:117] "RemoveContainer" containerID="efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.012628 4886 scope.go:117] "RemoveContainer" containerID="35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 
16:46:20.014361 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.035185 4886 scope.go:117] "RemoveContainer" containerID="bd8b45bdbc53c5a19f5d9b16c77f16088c5159f9cfac3b1dd35c0f4cdab8672d" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.056014 4886 scope.go:117] "RemoveContainer" containerID="efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.056932 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e\": container with ID starting with efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e not found: ID does not exist" containerID="efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.056989 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e"} err="failed to get container status \"efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e\": rpc error: code = NotFound desc = could not find container \"efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e\": container with ID starting with efe76a3e970848dc3228f84915fb95af5f8ed14f0bcb5b641221638cab0f714e not found: ID does not exist" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.057019 4886 scope.go:117] "RemoveContainer" containerID="35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.057589 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88\": container with ID starting with 35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88 not found: ID does not exist" containerID="35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.057654 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88"} err="failed to get container status \"35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88\": rpc error: code = NotFound desc = could not find container \"35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88\": container with ID starting with 35212758091bf8c3d45fb0a080810d5fded73e71ef6c555edea92ef2d2dcec88 not found: ID does not exist" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.057706 4886 scope.go:117] "RemoveContainer" containerID="bd8b45bdbc53c5a19f5d9b16c77f16088c5159f9cfac3b1dd35c0f4cdab8672d" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.058275 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8b45bdbc53c5a19f5d9b16c77f16088c5159f9cfac3b1dd35c0f4cdab8672d\": container with ID starting with bd8b45bdbc53c5a19f5d9b16c77f16088c5159f9cfac3b1dd35c0f4cdab8672d not found: ID does not exist" containerID="bd8b45bdbc53c5a19f5d9b16c77f16088c5159f9cfac3b1dd35c0f4cdab8672d" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.058304 4886 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8b45bdbc53c5a19f5d9b16c77f16088c5159f9cfac3b1dd35c0f4cdab8672d"} err="failed to get container status \"bd8b45bdbc53c5a19f5d9b16c77f16088c5159f9cfac3b1dd35c0f4cdab8672d\": rpc error: code = NotFound desc = could not find container \"bd8b45bdbc53c5a19f5d9b16c77f16088c5159f9cfac3b1dd35c0f4cdab8672d\": container with ID starting with bd8b45bdbc53c5a19f5d9b16c77f16088c5159f9cfac3b1dd35c0f4cdab8672d not found: ID does not exist" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.058394 4886 scope.go:117] "RemoveContainer" containerID="26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.077581 4886 scope.go:117] "RemoveContainer" containerID="d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.093561 4886 scope.go:117] "RemoveContainer" containerID="9483d17c90afb2d261251cb57ed87c956106b0b7bb964afcffdf0a2d1b5b13c1" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.111420 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-catalog-content\") pod \"70fc38f3-74c0-462d-9ad2-60f109b2d365\" (UID: \"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.111449 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-utilities\") pod \"70fc38f3-74c0-462d-9ad2-60f109b2d365\" (UID: \"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.111470 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvhpt\" (UniqueName: \"kubernetes.io/projected/70fc38f3-74c0-462d-9ad2-60f109b2d365-kube-api-access-bvhpt\") pod \"70fc38f3-74c0-462d-9ad2-60f109b2d365\" (UID: \"70fc38f3-74c0-462d-9ad2-60f109b2d365\") " Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.111472 4886 scope.go:117] "RemoveContainer" containerID="26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.111809 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d\": container with ID starting with 26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d not found: ID does not exist" containerID="26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.111861 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d"} err="failed to get container status \"26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d\": rpc error: code = NotFound desc = could not find container \"26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d\": container with ID starting with 26900ab338bee6799e69566c733a5063575a2c6eeacf71f0f523248ae71b1b2d not found: ID does not exist" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.111878 4886 scope.go:117] "RemoveContainer" containerID="d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4" Jan 29 16:46:20 crc 
kubenswrapper[4886]: E0129 16:46:20.112227 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4\": container with ID starting with d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4 not found: ID does not exist" containerID="d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.112241 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4"} err="failed to get container status \"d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4\": rpc error: code = NotFound desc = could not find container \"d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4\": container with ID starting with d611665f3c9d008d6e151d05993039687945f7572ec764930a3d9ccea183c1b4 not found: ID does not exist" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.112254 4886 scope.go:117] "RemoveContainer" containerID="9483d17c90afb2d261251cb57ed87c956106b0b7bb964afcffdf0a2d1b5b13c1" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.112493 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9483d17c90afb2d261251cb57ed87c956106b0b7bb964afcffdf0a2d1b5b13c1\": container with ID starting with 9483d17c90afb2d261251cb57ed87c956106b0b7bb964afcffdf0a2d1b5b13c1 not found: ID does not exist" containerID="9483d17c90afb2d261251cb57ed87c956106b0b7bb964afcffdf0a2d1b5b13c1" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.112510 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9483d17c90afb2d261251cb57ed87c956106b0b7bb964afcffdf0a2d1b5b13c1"} err="failed to get container status \"9483d17c90afb2d261251cb57ed87c956106b0b7bb964afcffdf0a2d1b5b13c1\": rpc error: code = NotFound desc = could not find container \"9483d17c90afb2d261251cb57ed87c956106b0b7bb964afcffdf0a2d1b5b13c1\": container with ID starting with 9483d17c90afb2d261251cb57ed87c956106b0b7bb964afcffdf0a2d1b5b13c1 not found: ID does not exist" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.114208 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-utilities" (OuterVolumeSpecName: "utilities") pod "70fc38f3-74c0-462d-9ad2-60f109b2d365" (UID: "70fc38f3-74c0-462d-9ad2-60f109b2d365"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.118086 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70fc38f3-74c0-462d-9ad2-60f109b2d365-kube-api-access-bvhpt" (OuterVolumeSpecName: "kube-api-access-bvhpt") pod "70fc38f3-74c0-462d-9ad2-60f109b2d365" (UID: "70fc38f3-74c0-462d-9ad2-60f109b2d365"). InnerVolumeSpecName "kube-api-access-bvhpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.150408 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70fc38f3-74c0-462d-9ad2-60f109b2d365" (UID: "70fc38f3-74c0-462d-9ad2-60f109b2d365"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.212265 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.212299 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70fc38f3-74c0-462d-9ad2-60f109b2d365-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.212309 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvhpt\" (UniqueName: \"kubernetes.io/projected/70fc38f3-74c0-462d-9ad2-60f109b2d365-kube-api-access-bvhpt\") on node \"crc\" DevicePath \"\"" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.633443 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a67e3b-3393-4dea-81c8-42c2e22ad315" path="/var/lib/kubelet/pods/20a67e3b-3393-4dea-81c8-42c2e22ad315/volumes" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.634790 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b8dc70-b29d-4995-9727-9b8e032bdad9" path="/var/lib/kubelet/pods/42b8dc70-b29d-4995-9727-9b8e032bdad9/volumes" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.635774 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" path="/var/lib/kubelet/pods/57aa9115-b2d5-45aa-8ac3-e251c0907e45/volumes" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.637756 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" path="/var/lib/kubelet/pods/69003a39-1c09-4087-a494-ebfd69e973cf/volumes" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.638981 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" path="/var/lib/kubelet/pods/a7325ad0-28bf-45e0-bbd5-160f441de091/volumes" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.641002 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" path="/var/lib/kubelet/pods/d84ce3e9-c41a-4a08-8d86-2a918d5e9450/volumes" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.818646 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" event={"ID":"9cb13d4a-3940-45ef-9135-ff94c6a75b0c","Type":"ContainerStarted","Data":"b1c2b8fd07bb7f6da16b71e8f971678bad1efd8c3f30512159a263059ee2d77a"} Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.818890 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.824592 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4tkp" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.824863 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4tkp" event={"ID":"70fc38f3-74c0-462d-9ad2-60f109b2d365","Type":"ContainerDied","Data":"fc5358167411608003143a7e9911eec6e0a3a3cefade8c9902a65d696f96288f"} Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.824940 4886 scope.go:117] "RemoveContainer" containerID="cd0174e3243b8d22b133a543427ce03858c997e6e589bac4aa5cc61f6f83f38c" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.828227 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.865754 4886 scope.go:117] "RemoveContainer" containerID="a6ec04dedfc222e2930d911f7475d986731b7050751d92e32b232da84ad7a329" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.882843 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m8snn" podStartSLOduration=2.882820359 podStartE2EDuration="2.882820359s" podCreationTimestamp="2026-01-29 16:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:46:20.857045806 +0000 UTC m=+1463.765765128" watchObservedRunningTime="2026-01-29 16:46:20.882820359 +0000 UTC m=+1463.791539641" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.883867 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ws2lm"] Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884171 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" containerName="extract-utilities" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884188 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" containerName="extract-utilities" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884197 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" containerName="extract-content" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884205 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" containerName="extract-content" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884221 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerName="extract-content" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884228 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerName="extract-content" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884243 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b8dc70-b29d-4995-9727-9b8e032bdad9" containerName="marketplace-operator" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884252 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b8dc70-b29d-4995-9727-9b8e032bdad9" containerName="marketplace-operator" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884261 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" containerName="registry-server" Jan 29 16:46:20 crc 
kubenswrapper[4886]: I0129 16:46:20.884270 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" containerName="registry-server" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884281 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" containerName="extract-utilities" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884288 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" containerName="extract-utilities" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884301 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" containerName="extract-utilities" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884307 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" containerName="extract-utilities" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884316 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70fc38f3-74c0-462d-9ad2-60f109b2d365" containerName="extract-utilities" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884340 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="70fc38f3-74c0-462d-9ad2-60f109b2d365" containerName="extract-utilities" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884351 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" containerName="extract-content" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884358 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" containerName="extract-content" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884365 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerName="registry-server" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884372 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerName="registry-server" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884389 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerName="extract-utilities" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884397 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerName="extract-utilities" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884406 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" containerName="registry-server" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884413 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" containerName="registry-server" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884426 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" containerName="extract-content" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884433 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" containerName="extract-content" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884631 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70fc38f3-74c0-462d-9ad2-60f109b2d365" 
containerName="extract-content" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884638 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="70fc38f3-74c0-462d-9ad2-60f109b2d365" containerName="extract-content" Jan 29 16:46:20 crc kubenswrapper[4886]: E0129 16:46:20.884645 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" containerName="registry-server" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884652 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" containerName="registry-server" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884806 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aa9115-b2d5-45aa-8ac3-e251c0907e45" containerName="registry-server" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884819 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="70fc38f3-74c0-462d-9ad2-60f109b2d365" containerName="extract-content" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884833 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b8dc70-b29d-4995-9727-9b8e032bdad9" containerName="marketplace-operator" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884843 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="69003a39-1c09-4087-a494-ebfd69e973cf" containerName="registry-server" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884853 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d84ce3e9-c41a-4a08-8d86-2a918d5e9450" containerName="registry-server" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.884863 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7325ad0-28bf-45e0-bbd5-160f441de091" containerName="registry-server" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.886024 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.892786 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.922887 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ab6536-f9ab-4191-9c15-f3fe0453e7d0-catalog-content\") pod \"certified-operators-ws2lm\" (UID: \"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0\") " pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.923236 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwj9l\" (UniqueName: \"kubernetes.io/projected/d8ab6536-f9ab-4191-9c15-f3fe0453e7d0-kube-api-access-vwj9l\") pod \"certified-operators-ws2lm\" (UID: \"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0\") " pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.923276 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ab6536-f9ab-4191-9c15-f3fe0453e7d0-utilities\") pod \"certified-operators-ws2lm\" (UID: \"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0\") " pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.927763 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ws2lm"] Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.942373 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4tkp"] Jan 29 16:46:20 crc kubenswrapper[4886]: I0129 16:46:20.948647 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4tkp"] Jan 29 16:46:21 crc kubenswrapper[4886]: I0129 16:46:21.023995 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwj9l\" (UniqueName: \"kubernetes.io/projected/d8ab6536-f9ab-4191-9c15-f3fe0453e7d0-kube-api-access-vwj9l\") pod \"certified-operators-ws2lm\" (UID: \"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0\") " pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:21 crc kubenswrapper[4886]: I0129 16:46:21.024052 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ab6536-f9ab-4191-9c15-f3fe0453e7d0-utilities\") pod \"certified-operators-ws2lm\" (UID: \"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0\") " pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:21 crc kubenswrapper[4886]: I0129 16:46:21.024097 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ab6536-f9ab-4191-9c15-f3fe0453e7d0-catalog-content\") pod \"certified-operators-ws2lm\" (UID: \"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0\") " pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:21 crc kubenswrapper[4886]: I0129 16:46:21.024772 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ab6536-f9ab-4191-9c15-f3fe0453e7d0-utilities\") pod \"certified-operators-ws2lm\" (UID: \"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0\") " 
pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:21 crc kubenswrapper[4886]: I0129 16:46:21.024788 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ab6536-f9ab-4191-9c15-f3fe0453e7d0-catalog-content\") pod \"certified-operators-ws2lm\" (UID: \"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0\") " pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:21 crc kubenswrapper[4886]: I0129 16:46:21.040593 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwj9l\" (UniqueName: \"kubernetes.io/projected/d8ab6536-f9ab-4191-9c15-f3fe0453e7d0-kube-api-access-vwj9l\") pod \"certified-operators-ws2lm\" (UID: \"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0\") " pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:21 crc kubenswrapper[4886]: I0129 16:46:21.254575 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:21 crc kubenswrapper[4886]: I0129 16:46:21.720592 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ws2lm"] Jan 29 16:46:21 crc kubenswrapper[4886]: I0129 16:46:21.836884 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws2lm" event={"ID":"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0","Type":"ContainerStarted","Data":"3e21e164e499c1d13413fe08e994414a06b124f6e168c863d9cce408a4c23cd1"} Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.287101 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vnttp"] Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.288364 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.291294 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.299283 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnttp"] Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.357073 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xzhg\" (UniqueName: \"kubernetes.io/projected/fbfc768f-4803-4f4e-9019-2aacda68bc47-kube-api-access-4xzhg\") pod \"community-operators-vnttp\" (UID: \"fbfc768f-4803-4f4e-9019-2aacda68bc47\") " pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.357157 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbfc768f-4803-4f4e-9019-2aacda68bc47-catalog-content\") pod \"community-operators-vnttp\" (UID: \"fbfc768f-4803-4f4e-9019-2aacda68bc47\") " pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.357188 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbfc768f-4803-4f4e-9019-2aacda68bc47-utilities\") pod \"community-operators-vnttp\" (UID: \"fbfc768f-4803-4f4e-9019-2aacda68bc47\") " pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.458974 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbfc768f-4803-4f4e-9019-2aacda68bc47-catalog-content\") pod \"community-operators-vnttp\" (UID: \"fbfc768f-4803-4f4e-9019-2aacda68bc47\") " pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.459046 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbfc768f-4803-4f4e-9019-2aacda68bc47-utilities\") pod \"community-operators-vnttp\" (UID: \"fbfc768f-4803-4f4e-9019-2aacda68bc47\") " pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.459143 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xzhg\" (UniqueName: \"kubernetes.io/projected/fbfc768f-4803-4f4e-9019-2aacda68bc47-kube-api-access-4xzhg\") pod \"community-operators-vnttp\" (UID: \"fbfc768f-4803-4f4e-9019-2aacda68bc47\") " pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.460093 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbfc768f-4803-4f4e-9019-2aacda68bc47-catalog-content\") pod \"community-operators-vnttp\" (UID: \"fbfc768f-4803-4f4e-9019-2aacda68bc47\") " pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.460102 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbfc768f-4803-4f4e-9019-2aacda68bc47-utilities\") pod \"community-operators-vnttp\" (UID: 
\"fbfc768f-4803-4f4e-9019-2aacda68bc47\") " pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.483597 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xzhg\" (UniqueName: \"kubernetes.io/projected/fbfc768f-4803-4f4e-9019-2aacda68bc47-kube-api-access-4xzhg\") pod \"community-operators-vnttp\" (UID: \"fbfc768f-4803-4f4e-9019-2aacda68bc47\") " pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.615897 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.627314 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70fc38f3-74c0-462d-9ad2-60f109b2d365" path="/var/lib/kubelet/pods/70fc38f3-74c0-462d-9ad2-60f109b2d365/volumes" Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.846881 4886 generic.go:334] "Generic (PLEG): container finished" podID="d8ab6536-f9ab-4191-9c15-f3fe0453e7d0" containerID="039d2652c8a0923a767a8f904be9db7661ebaebd943eeea44963f20c2ca8a4e7" exitCode=0 Jan 29 16:46:22 crc kubenswrapper[4886]: I0129 16:46:22.846916 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws2lm" event={"ID":"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0","Type":"ContainerDied","Data":"039d2652c8a0923a767a8f904be9db7661ebaebd943eeea44963f20c2ca8a4e7"} Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.067209 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnttp"] Jan 29 16:46:23 crc kubenswrapper[4886]: W0129 16:46:23.078471 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbfc768f_4803_4f4e_9019_2aacda68bc47.slice/crio-d40c7ee3bf4d4b4b9d77673f8aaefd16f5cb607897cbf316986478e281bb9b0e WatchSource:0}: Error finding container d40c7ee3bf4d4b4b9d77673f8aaefd16f5cb607897cbf316986478e281bb9b0e: Status 404 returned error can't find the container with id d40c7ee3bf4d4b4b9d77673f8aaefd16f5cb607897cbf316986478e281bb9b0e Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.288402 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6bdhs"] Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.290405 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.293410 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.300463 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6bdhs"] Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.478627 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e49770-fa31-4780-a5ac-38a6bc1221a9-catalog-content\") pod \"redhat-operators-6bdhs\" (UID: \"80e49770-fa31-4780-a5ac-38a6bc1221a9\") " pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.478703 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65qhg\" (UniqueName: \"kubernetes.io/projected/80e49770-fa31-4780-a5ac-38a6bc1221a9-kube-api-access-65qhg\") pod \"redhat-operators-6bdhs\" (UID: \"80e49770-fa31-4780-a5ac-38a6bc1221a9\") " pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.478749 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e49770-fa31-4780-a5ac-38a6bc1221a9-utilities\") pod \"redhat-operators-6bdhs\" (UID: \"80e49770-fa31-4780-a5ac-38a6bc1221a9\") " pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.580770 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e49770-fa31-4780-a5ac-38a6bc1221a9-catalog-content\") pod \"redhat-operators-6bdhs\" (UID: \"80e49770-fa31-4780-a5ac-38a6bc1221a9\") " pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.580876 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65qhg\" (UniqueName: \"kubernetes.io/projected/80e49770-fa31-4780-a5ac-38a6bc1221a9-kube-api-access-65qhg\") pod \"redhat-operators-6bdhs\" (UID: \"80e49770-fa31-4780-a5ac-38a6bc1221a9\") " pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.580961 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e49770-fa31-4780-a5ac-38a6bc1221a9-utilities\") pod \"redhat-operators-6bdhs\" (UID: \"80e49770-fa31-4780-a5ac-38a6bc1221a9\") " pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.581717 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e49770-fa31-4780-a5ac-38a6bc1221a9-catalog-content\") pod \"redhat-operators-6bdhs\" (UID: \"80e49770-fa31-4780-a5ac-38a6bc1221a9\") " pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.581741 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e49770-fa31-4780-a5ac-38a6bc1221a9-utilities\") pod \"redhat-operators-6bdhs\" (UID: \"80e49770-fa31-4780-a5ac-38a6bc1221a9\") " 
pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.609770 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65qhg\" (UniqueName: \"kubernetes.io/projected/80e49770-fa31-4780-a5ac-38a6bc1221a9-kube-api-access-65qhg\") pod \"redhat-operators-6bdhs\" (UID: \"80e49770-fa31-4780-a5ac-38a6bc1221a9\") " pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.615020 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:23 crc kubenswrapper[4886]: I0129 16:46:23.857486 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnttp" event={"ID":"fbfc768f-4803-4f4e-9019-2aacda68bc47","Type":"ContainerStarted","Data":"d40c7ee3bf4d4b4b9d77673f8aaefd16f5cb607897cbf316986478e281bb9b0e"} Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.142991 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6bdhs"] Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.681685 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-52bfx"] Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.683297 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.686187 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.687521 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-52bfx"] Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.799637 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca-utilities\") pod \"redhat-marketplace-52bfx\" (UID: \"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca\") " pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.799946 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7694n\" (UniqueName: \"kubernetes.io/projected/87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca-kube-api-access-7694n\") pod \"redhat-marketplace-52bfx\" (UID: \"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca\") " pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.799982 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca-catalog-content\") pod \"redhat-marketplace-52bfx\" (UID: \"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca\") " pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.868535 4886 generic.go:334] "Generic (PLEG): container finished" podID="fbfc768f-4803-4f4e-9019-2aacda68bc47" containerID="d660e8ba51141212057357f1c6afcfdf2f206393e2a4f6b098221cfd1be48212" exitCode=0 Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.868606 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnttp" 
event={"ID":"fbfc768f-4803-4f4e-9019-2aacda68bc47","Type":"ContainerDied","Data":"d660e8ba51141212057357f1c6afcfdf2f206393e2a4f6b098221cfd1be48212"} Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.870975 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bdhs" event={"ID":"80e49770-fa31-4780-a5ac-38a6bc1221a9","Type":"ContainerStarted","Data":"bb10669d5c9319d3f6b647732aa83aaed3939b3c1381053c9f2eca3c370d3282"} Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.870999 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bdhs" event={"ID":"80e49770-fa31-4780-a5ac-38a6bc1221a9","Type":"ContainerStarted","Data":"0a0fa418ed3ea00bd740848278269fe5bbbe31cf0912ca198a306059478ec782"} Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.901245 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7694n\" (UniqueName: \"kubernetes.io/projected/87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca-kube-api-access-7694n\") pod \"redhat-marketplace-52bfx\" (UID: \"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca\") " pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.901307 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca-catalog-content\") pod \"redhat-marketplace-52bfx\" (UID: \"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca\") " pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.901487 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca-utilities\") pod \"redhat-marketplace-52bfx\" (UID: \"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca\") " pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.902255 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca-utilities\") pod \"redhat-marketplace-52bfx\" (UID: \"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca\") " pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.902414 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca-catalog-content\") pod \"redhat-marketplace-52bfx\" (UID: \"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca\") " pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:24 crc kubenswrapper[4886]: I0129 16:46:24.919005 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7694n\" (UniqueName: \"kubernetes.io/projected/87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca-kube-api-access-7694n\") pod \"redhat-marketplace-52bfx\" (UID: \"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca\") " pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:25 crc kubenswrapper[4886]: I0129 16:46:25.049750 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:25 crc kubenswrapper[4886]: I0129 16:46:25.512398 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-52bfx"] Jan 29 16:46:25 crc kubenswrapper[4886]: W0129 16:46:25.524712 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87b65e80_b30f_4ac4_bb06_ec8eb04cd7ca.slice/crio-2fa2b8c2ffecf9cfabe0d29fe2ec3fcc727cf17a5638653e0dda06d83e26ae2e WatchSource:0}: Error finding container 2fa2b8c2ffecf9cfabe0d29fe2ec3fcc727cf17a5638653e0dda06d83e26ae2e: Status 404 returned error can't find the container with id 2fa2b8c2ffecf9cfabe0d29fe2ec3fcc727cf17a5638653e0dda06d83e26ae2e Jan 29 16:46:25 crc kubenswrapper[4886]: I0129 16:46:25.879128 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52bfx" event={"ID":"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca","Type":"ContainerStarted","Data":"2fa2b8c2ffecf9cfabe0d29fe2ec3fcc727cf17a5638653e0dda06d83e26ae2e"} Jan 29 16:46:25 crc kubenswrapper[4886]: I0129 16:46:25.881045 4886 generic.go:334] "Generic (PLEG): container finished" podID="80e49770-fa31-4780-a5ac-38a6bc1221a9" containerID="bb10669d5c9319d3f6b647732aa83aaed3939b3c1381053c9f2eca3c370d3282" exitCode=0 Jan 29 16:46:25 crc kubenswrapper[4886]: I0129 16:46:25.881075 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bdhs" event={"ID":"80e49770-fa31-4780-a5ac-38a6bc1221a9","Type":"ContainerDied","Data":"bb10669d5c9319d3f6b647732aa83aaed3939b3c1381053c9f2eca3c370d3282"} Jan 29 16:46:26 crc kubenswrapper[4886]: I0129 16:46:26.892245 4886 generic.go:334] "Generic (PLEG): container finished" podID="87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca" containerID="465a7f1f1f8324da6688bb49b19359ff8dfdf2d01808f80da09155338e2c3325" exitCode=0 Jan 29 16:46:26 crc kubenswrapper[4886]: I0129 16:46:26.892346 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52bfx" event={"ID":"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca","Type":"ContainerDied","Data":"465a7f1f1f8324da6688bb49b19359ff8dfdf2d01808f80da09155338e2c3325"} Jan 29 16:46:32 crc kubenswrapper[4886]: I0129 16:46:32.955538 4886 generic.go:334] "Generic (PLEG): container finished" podID="80e49770-fa31-4780-a5ac-38a6bc1221a9" containerID="678b6453290fdf5637a9f4f9fc3768a75a11de08b2393b56c97065c9afb6c6c5" exitCode=0 Jan 29 16:46:32 crc kubenswrapper[4886]: I0129 16:46:32.955578 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bdhs" event={"ID":"80e49770-fa31-4780-a5ac-38a6bc1221a9","Type":"ContainerDied","Data":"678b6453290fdf5637a9f4f9fc3768a75a11de08b2393b56c97065c9afb6c6c5"} Jan 29 16:46:32 crc kubenswrapper[4886]: I0129 16:46:32.959905 4886 generic.go:334] "Generic (PLEG): container finished" podID="87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca" containerID="c271f5517b1a393ad7a319989ad78bb14460e266f8b7d0dd30fa11b2117eed12" exitCode=0 Jan 29 16:46:32 crc kubenswrapper[4886]: I0129 16:46:32.960022 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52bfx" event={"ID":"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca","Type":"ContainerDied","Data":"c271f5517b1a393ad7a319989ad78bb14460e266f8b7d0dd30fa11b2117eed12"} Jan 29 16:46:32 crc kubenswrapper[4886]: I0129 16:46:32.964729 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-vnttp" event={"ID":"fbfc768f-4803-4f4e-9019-2aacda68bc47","Type":"ContainerDied","Data":"739e0c3bda5b06aeff00908644b4d9e39c1f3a83a5a16cd5592a2b3a0a84edfd"} Jan 29 16:46:32 crc kubenswrapper[4886]: I0129 16:46:32.965507 4886 generic.go:334] "Generic (PLEG): container finished" podID="fbfc768f-4803-4f4e-9019-2aacda68bc47" containerID="739e0c3bda5b06aeff00908644b4d9e39c1f3a83a5a16cd5592a2b3a0a84edfd" exitCode=0 Jan 29 16:46:32 crc kubenswrapper[4886]: I0129 16:46:32.968588 4886 generic.go:334] "Generic (PLEG): container finished" podID="d8ab6536-f9ab-4191-9c15-f3fe0453e7d0" containerID="01d7355dcfd37a7bab0f2bcc4a2027184d154d94d7fe052a3562aac5da1f3ea9" exitCode=0 Jan 29 16:46:32 crc kubenswrapper[4886]: I0129 16:46:32.968619 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws2lm" event={"ID":"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0","Type":"ContainerDied","Data":"01d7355dcfd37a7bab0f2bcc4a2027184d154d94d7fe052a3562aac5da1f3ea9"} Jan 29 16:46:33 crc kubenswrapper[4886]: I0129 16:46:33.976620 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52bfx" event={"ID":"87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca","Type":"ContainerStarted","Data":"c2358db1a793cf91ba9b1970509b2d9ead3a2a92dd1c2dd79c206d3c2ac53fe1"} Jan 29 16:46:33 crc kubenswrapper[4886]: I0129 16:46:33.979685 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnttp" event={"ID":"fbfc768f-4803-4f4e-9019-2aacda68bc47","Type":"ContainerStarted","Data":"fd2a379c76b14741304253025eccc7f873d5f70c10124608ac47d2565d5b17aa"} Jan 29 16:46:33 crc kubenswrapper[4886]: I0129 16:46:33.982202 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws2lm" event={"ID":"d8ab6536-f9ab-4191-9c15-f3fe0453e7d0","Type":"ContainerStarted","Data":"c4980d3736fdac0444c07a1fb0ca4e2f07d9f6fe2014605185318260906ccd7f"} Jan 29 16:46:34 crc kubenswrapper[4886]: I0129 16:46:34.000099 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-52bfx" podStartSLOduration=4.201101385 podStartE2EDuration="10.000082429s" podCreationTimestamp="2026-01-29 16:46:24 +0000 UTC" firstStartedPulling="2026-01-29 16:46:27.724131115 +0000 UTC m=+1470.632850397" lastFinishedPulling="2026-01-29 16:46:33.523112169 +0000 UTC m=+1476.431831441" observedRunningTime="2026-01-29 16:46:33.99468046 +0000 UTC m=+1476.903399732" watchObservedRunningTime="2026-01-29 16:46:34.000082429 +0000 UTC m=+1476.908801721" Jan 29 16:46:34 crc kubenswrapper[4886]: I0129 16:46:34.019177 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ws2lm" podStartSLOduration=4.376141396 podStartE2EDuration="14.01915911s" podCreationTimestamp="2026-01-29 16:46:20 +0000 UTC" firstStartedPulling="2026-01-29 16:46:23.860158092 +0000 UTC m=+1466.768877404" lastFinishedPulling="2026-01-29 16:46:33.503175806 +0000 UTC m=+1476.411895118" observedRunningTime="2026-01-29 16:46:34.017467376 +0000 UTC m=+1476.926186658" watchObservedRunningTime="2026-01-29 16:46:34.01915911 +0000 UTC m=+1476.927878382" Jan 29 16:46:34 crc kubenswrapper[4886]: I0129 16:46:34.036571 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vnttp" podStartSLOduration=3.429299607 podStartE2EDuration="12.036552157s" 
podCreationTimestamp="2026-01-29 16:46:22 +0000 UTC" firstStartedPulling="2026-01-29 16:46:24.870618946 +0000 UTC m=+1467.779338218" lastFinishedPulling="2026-01-29 16:46:33.477871486 +0000 UTC m=+1476.386590768" observedRunningTime="2026-01-29 16:46:34.035563261 +0000 UTC m=+1476.944282533" watchObservedRunningTime="2026-01-29 16:46:34.036552157 +0000 UTC m=+1476.945271429" Jan 29 16:46:34 crc kubenswrapper[4886]: I0129 16:46:34.991287 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bdhs" event={"ID":"80e49770-fa31-4780-a5ac-38a6bc1221a9","Type":"ContainerStarted","Data":"ec7b6330f582c97b42a3f4b7b50704b44b590e0a9732dc553abfaf3dade38a3f"} Jan 29 16:46:35 crc kubenswrapper[4886]: I0129 16:46:35.050489 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:35 crc kubenswrapper[4886]: I0129 16:46:35.050544 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:36 crc kubenswrapper[4886]: I0129 16:46:36.096559 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-52bfx" podUID="87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca" containerName="registry-server" probeResult="failure" output=< Jan 29 16:46:36 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 16:46:36 crc kubenswrapper[4886]: > Jan 29 16:46:41 crc kubenswrapper[4886]: I0129 16:46:41.255669 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:41 crc kubenswrapper[4886]: I0129 16:46:41.256181 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:41 crc kubenswrapper[4886]: I0129 16:46:41.325849 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:41 crc kubenswrapper[4886]: I0129 16:46:41.355973 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6bdhs" podStartSLOduration=10.337538849 podStartE2EDuration="18.355947382s" podCreationTimestamp="2026-01-29 16:46:23 +0000 UTC" firstStartedPulling="2026-01-29 16:46:25.8830051 +0000 UTC m=+1468.791724392" lastFinishedPulling="2026-01-29 16:46:33.901413653 +0000 UTC m=+1476.810132925" observedRunningTime="2026-01-29 16:46:35.013719954 +0000 UTC m=+1477.922439236" watchObservedRunningTime="2026-01-29 16:46:41.355947382 +0000 UTC m=+1484.264666684" Jan 29 16:46:42 crc kubenswrapper[4886]: I0129 16:46:42.108568 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ws2lm" Jan 29 16:46:42 crc kubenswrapper[4886]: I0129 16:46:42.625278 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:42 crc kubenswrapper[4886]: I0129 16:46:42.625368 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:42 crc kubenswrapper[4886]: I0129 16:46:42.686438 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:43 crc kubenswrapper[4886]: I0129 16:46:43.105783 4886 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vnttp" Jan 29 16:46:43 crc kubenswrapper[4886]: I0129 16:46:43.616987 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:43 crc kubenswrapper[4886]: I0129 16:46:43.617048 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:43 crc kubenswrapper[4886]: I0129 16:46:43.660432 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:44 crc kubenswrapper[4886]: I0129 16:46:44.114790 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6bdhs" Jan 29 16:46:45 crc kubenswrapper[4886]: I0129 16:46:45.098201 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:46:45 crc kubenswrapper[4886]: I0129 16:46:45.151501 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-52bfx" Jan 29 16:47:29 crc kubenswrapper[4886]: I0129 16:47:29.660957 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:47:29 crc kubenswrapper[4886]: I0129 16:47:29.661414 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:47:59 crc kubenswrapper[4886]: I0129 16:47:59.660608 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:47:59 crc kubenswrapper[4886]: I0129 16:47:59.661308 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:48:02 crc kubenswrapper[4886]: I0129 16:48:02.899290 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8lqx2"] Jan 29 16:48:02 crc kubenswrapper[4886]: I0129 16:48:02.901220 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:02 crc kubenswrapper[4886]: I0129 16:48:02.911565 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8lqx2"] Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.063499 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfkc\" (UniqueName: \"kubernetes.io/projected/860fb30c-4c3d-4f6f-95ff-1de487069087-kube-api-access-bpfkc\") pod \"certified-operators-8lqx2\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.063561 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-catalog-content\") pod \"certified-operators-8lqx2\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.063787 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-utilities\") pod \"certified-operators-8lqx2\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.165418 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfkc\" (UniqueName: \"kubernetes.io/projected/860fb30c-4c3d-4f6f-95ff-1de487069087-kube-api-access-bpfkc\") pod \"certified-operators-8lqx2\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.165510 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-catalog-content\") pod \"certified-operators-8lqx2\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.165608 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-utilities\") pod \"certified-operators-8lqx2\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.166209 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-utilities\") pod \"certified-operators-8lqx2\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.166965 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-catalog-content\") pod \"certified-operators-8lqx2\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.189949 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bpfkc\" (UniqueName: \"kubernetes.io/projected/860fb30c-4c3d-4f6f-95ff-1de487069087-kube-api-access-bpfkc\") pod \"certified-operators-8lqx2\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.228705 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.505086 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8lqx2"] Jan 29 16:48:03 crc kubenswrapper[4886]: I0129 16:48:03.704250 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lqx2" event={"ID":"860fb30c-4c3d-4f6f-95ff-1de487069087","Type":"ContainerStarted","Data":"4a87461a09c78133699165864f57ffc889764f0fa2a316800d5d0c489c5bd1b0"} Jan 29 16:48:04 crc kubenswrapper[4886]: I0129 16:48:04.717645 4886 generic.go:334] "Generic (PLEG): container finished" podID="860fb30c-4c3d-4f6f-95ff-1de487069087" containerID="c5afd1cb7edd41e37a61e7964e9a3936fe9580078d8088abebe1e915156bc1d7" exitCode=0 Jan 29 16:48:04 crc kubenswrapper[4886]: I0129 16:48:04.717870 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lqx2" event={"ID":"860fb30c-4c3d-4f6f-95ff-1de487069087","Type":"ContainerDied","Data":"c5afd1cb7edd41e37a61e7964e9a3936fe9580078d8088abebe1e915156bc1d7"} Jan 29 16:48:04 crc kubenswrapper[4886]: I0129 16:48:04.721520 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 16:48:06 crc kubenswrapper[4886]: I0129 16:48:06.738207 4886 generic.go:334] "Generic (PLEG): container finished" podID="860fb30c-4c3d-4f6f-95ff-1de487069087" containerID="237729db2181ba06bb5b9a2990ef2432c906b9314a10c99ac22c691a2275eb5e" exitCode=0 Jan 29 16:48:06 crc kubenswrapper[4886]: I0129 16:48:06.738265 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lqx2" event={"ID":"860fb30c-4c3d-4f6f-95ff-1de487069087","Type":"ContainerDied","Data":"237729db2181ba06bb5b9a2990ef2432c906b9314a10c99ac22c691a2275eb5e"} Jan 29 16:48:08 crc kubenswrapper[4886]: I0129 16:48:08.758791 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lqx2" event={"ID":"860fb30c-4c3d-4f6f-95ff-1de487069087","Type":"ContainerStarted","Data":"25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd"} Jan 29 16:48:08 crc kubenswrapper[4886]: I0129 16:48:08.781834 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8lqx2" podStartSLOduration=3.755151723 podStartE2EDuration="6.781811368s" podCreationTimestamp="2026-01-29 16:48:02 +0000 UTC" firstStartedPulling="2026-01-29 16:48:04.721173477 +0000 UTC m=+1567.629892759" lastFinishedPulling="2026-01-29 16:48:07.747833092 +0000 UTC m=+1570.656552404" observedRunningTime="2026-01-29 16:48:08.778383491 +0000 UTC m=+1571.687102763" watchObservedRunningTime="2026-01-29 16:48:08.781811368 +0000 UTC m=+1571.690530650" Jan 29 16:48:13 crc kubenswrapper[4886]: I0129 16:48:13.229778 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:13 crc kubenswrapper[4886]: I0129 16:48:13.230459 4886 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:13 crc kubenswrapper[4886]: I0129 16:48:13.294800 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:13 crc kubenswrapper[4886]: I0129 16:48:13.848465 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:13 crc kubenswrapper[4886]: I0129 16:48:13.919937 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8lqx2"] Jan 29 16:48:15 crc kubenswrapper[4886]: I0129 16:48:15.817673 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8lqx2" podUID="860fb30c-4c3d-4f6f-95ff-1de487069087" containerName="registry-server" containerID="cri-o://25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd" gracePeriod=2 Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.375644 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.507927 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpfkc\" (UniqueName: \"kubernetes.io/projected/860fb30c-4c3d-4f6f-95ff-1de487069087-kube-api-access-bpfkc\") pod \"860fb30c-4c3d-4f6f-95ff-1de487069087\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.508100 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-utilities\") pod \"860fb30c-4c3d-4f6f-95ff-1de487069087\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.508147 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-catalog-content\") pod \"860fb30c-4c3d-4f6f-95ff-1de487069087\" (UID: \"860fb30c-4c3d-4f6f-95ff-1de487069087\") " Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.509647 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-utilities" (OuterVolumeSpecName: "utilities") pod "860fb30c-4c3d-4f6f-95ff-1de487069087" (UID: "860fb30c-4c3d-4f6f-95ff-1de487069087"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.517197 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860fb30c-4c3d-4f6f-95ff-1de487069087-kube-api-access-bpfkc" (OuterVolumeSpecName: "kube-api-access-bpfkc") pod "860fb30c-4c3d-4f6f-95ff-1de487069087" (UID: "860fb30c-4c3d-4f6f-95ff-1de487069087"). InnerVolumeSpecName "kube-api-access-bpfkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.559077 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "860fb30c-4c3d-4f6f-95ff-1de487069087" (UID: "860fb30c-4c3d-4f6f-95ff-1de487069087"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.609607 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpfkc\" (UniqueName: \"kubernetes.io/projected/860fb30c-4c3d-4f6f-95ff-1de487069087-kube-api-access-bpfkc\") on node \"crc\" DevicePath \"\"" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.609648 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.609663 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860fb30c-4c3d-4f6f-95ff-1de487069087-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.826926 4886 generic.go:334] "Generic (PLEG): container finished" podID="860fb30c-4c3d-4f6f-95ff-1de487069087" containerID="25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd" exitCode=0 Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.826967 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lqx2" event={"ID":"860fb30c-4c3d-4f6f-95ff-1de487069087","Type":"ContainerDied","Data":"25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd"} Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.826992 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lqx2" event={"ID":"860fb30c-4c3d-4f6f-95ff-1de487069087","Type":"ContainerDied","Data":"4a87461a09c78133699165864f57ffc889764f0fa2a316800d5d0c489c5bd1b0"} Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.827021 4886 scope.go:117] "RemoveContainer" containerID="25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.827075 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8lqx2" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.857788 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8lqx2"] Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.863447 4886 scope.go:117] "RemoveContainer" containerID="237729db2181ba06bb5b9a2990ef2432c906b9314a10c99ac22c691a2275eb5e" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.868085 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8lqx2"] Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.882316 4886 scope.go:117] "RemoveContainer" containerID="c5afd1cb7edd41e37a61e7964e9a3936fe9580078d8088abebe1e915156bc1d7" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.923932 4886 scope.go:117] "RemoveContainer" containerID="25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd" Jan 29 16:48:16 crc kubenswrapper[4886]: E0129 16:48:16.924450 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd\": container with ID starting with 25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd not found: ID does not exist" containerID="25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.924483 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd"} err="failed to get container status \"25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd\": rpc error: code = NotFound desc = could not find container \"25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd\": container with ID starting with 25be302db85a3629c40f39797bdcb5e4d80c59b44b547a44db6482c33891e0dd not found: ID does not exist" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.924507 4886 scope.go:117] "RemoveContainer" containerID="237729db2181ba06bb5b9a2990ef2432c906b9314a10c99ac22c691a2275eb5e" Jan 29 16:48:16 crc kubenswrapper[4886]: E0129 16:48:16.924929 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"237729db2181ba06bb5b9a2990ef2432c906b9314a10c99ac22c691a2275eb5e\": container with ID starting with 237729db2181ba06bb5b9a2990ef2432c906b9314a10c99ac22c691a2275eb5e not found: ID does not exist" containerID="237729db2181ba06bb5b9a2990ef2432c906b9314a10c99ac22c691a2275eb5e" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.924961 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237729db2181ba06bb5b9a2990ef2432c906b9314a10c99ac22c691a2275eb5e"} err="failed to get container status \"237729db2181ba06bb5b9a2990ef2432c906b9314a10c99ac22c691a2275eb5e\": rpc error: code = NotFound desc = could not find container \"237729db2181ba06bb5b9a2990ef2432c906b9314a10c99ac22c691a2275eb5e\": container with ID starting with 237729db2181ba06bb5b9a2990ef2432c906b9314a10c99ac22c691a2275eb5e not found: ID does not exist" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.924982 4886 scope.go:117] "RemoveContainer" containerID="c5afd1cb7edd41e37a61e7964e9a3936fe9580078d8088abebe1e915156bc1d7" Jan 29 16:48:16 crc kubenswrapper[4886]: E0129 16:48:16.925348 4886 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c5afd1cb7edd41e37a61e7964e9a3936fe9580078d8088abebe1e915156bc1d7\": container with ID starting with c5afd1cb7edd41e37a61e7964e9a3936fe9580078d8088abebe1e915156bc1d7 not found: ID does not exist" containerID="c5afd1cb7edd41e37a61e7964e9a3936fe9580078d8088abebe1e915156bc1d7" Jan 29 16:48:16 crc kubenswrapper[4886]: I0129 16:48:16.925451 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5afd1cb7edd41e37a61e7964e9a3936fe9580078d8088abebe1e915156bc1d7"} err="failed to get container status \"c5afd1cb7edd41e37a61e7964e9a3936fe9580078d8088abebe1e915156bc1d7\": rpc error: code = NotFound desc = could not find container \"c5afd1cb7edd41e37a61e7964e9a3936fe9580078d8088abebe1e915156bc1d7\": container with ID starting with c5afd1cb7edd41e37a61e7964e9a3936fe9580078d8088abebe1e915156bc1d7 not found: ID does not exist" Jan 29 16:48:18 crc kubenswrapper[4886]: I0129 16:48:18.627543 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860fb30c-4c3d-4f6f-95ff-1de487069087" path="/var/lib/kubelet/pods/860fb30c-4c3d-4f6f-95ff-1de487069087/volumes" Jan 29 16:48:29 crc kubenswrapper[4886]: I0129 16:48:29.661077 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:48:29 crc kubenswrapper[4886]: I0129 16:48:29.661733 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:48:29 crc kubenswrapper[4886]: I0129 16:48:29.661796 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 16:48:29 crc kubenswrapper[4886]: I0129 16:48:29.663219 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 16:48:29 crc kubenswrapper[4886]: I0129 16:48:29.663410 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" gracePeriod=600 Jan 29 16:48:29 crc kubenswrapper[4886]: E0129 16:48:29.789849 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:48:29 crc kubenswrapper[4886]: I0129 16:48:29.963825 4886 generic.go:334] 
"Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" exitCode=0 Jan 29 16:48:29 crc kubenswrapper[4886]: I0129 16:48:29.963941 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463"} Jan 29 16:48:29 crc kubenswrapper[4886]: I0129 16:48:29.964035 4886 scope.go:117] "RemoveContainer" containerID="e07342110c4b02787cb4723c63fa377397be4b574d1be34193ab1f7b4cebac54" Jan 29 16:48:29 crc kubenswrapper[4886]: I0129 16:48:29.964936 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:48:29 crc kubenswrapper[4886]: E0129 16:48:29.965469 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:48:42 crc kubenswrapper[4886]: I0129 16:48:42.614967 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:48:42 crc kubenswrapper[4886]: E0129 16:48:42.616120 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.226438 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rb649"] Jan 29 16:48:46 crc kubenswrapper[4886]: E0129 16:48:46.227084 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860fb30c-4c3d-4f6f-95ff-1de487069087" containerName="extract-content" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.227102 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="860fb30c-4c3d-4f6f-95ff-1de487069087" containerName="extract-content" Jan 29 16:48:46 crc kubenswrapper[4886]: E0129 16:48:46.227130 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860fb30c-4c3d-4f6f-95ff-1de487069087" containerName="registry-server" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.227138 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="860fb30c-4c3d-4f6f-95ff-1de487069087" containerName="registry-server" Jan 29 16:48:46 crc kubenswrapper[4886]: E0129 16:48:46.227158 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860fb30c-4c3d-4f6f-95ff-1de487069087" containerName="extract-utilities" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.227168 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="860fb30c-4c3d-4f6f-95ff-1de487069087" containerName="extract-utilities" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.227354 4886 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="860fb30c-4c3d-4f6f-95ff-1de487069087" containerName="registry-server" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.228589 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.250094 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rb649"] Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.340405 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-catalog-content\") pod \"redhat-operators-rb649\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.340475 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6jb6\" (UniqueName: \"kubernetes.io/projected/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-kube-api-access-t6jb6\") pod \"redhat-operators-rb649\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.340739 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-utilities\") pod \"redhat-operators-rb649\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.441999 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-utilities\") pod \"redhat-operators-rb649\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.442130 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-catalog-content\") pod \"redhat-operators-rb649\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.442171 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6jb6\" (UniqueName: \"kubernetes.io/projected/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-kube-api-access-t6jb6\") pod \"redhat-operators-rb649\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.442571 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-utilities\") pod \"redhat-operators-rb649\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.443205 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-catalog-content\") pod \"redhat-operators-rb649\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " 
pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.463692 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6jb6\" (UniqueName: \"kubernetes.io/projected/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-kube-api-access-t6jb6\") pod \"redhat-operators-rb649\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:46 crc kubenswrapper[4886]: I0129 16:48:46.548316 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:47 crc kubenswrapper[4886]: I0129 16:48:47.012502 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rb649"] Jan 29 16:48:47 crc kubenswrapper[4886]: I0129 16:48:47.093375 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rb649" event={"ID":"fc1d1fd3-36c5-4b47-bd32-230dc4453e57","Type":"ContainerStarted","Data":"d857247c97994f556a4a4a300a7f0839fd8e562211f2e6ae427fe0ad1d0d3d48"} Jan 29 16:48:48 crc kubenswrapper[4886]: I0129 16:48:48.103800 4886 generic.go:334] "Generic (PLEG): container finished" podID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerID="cc46e50228c504a5ce69248aa0c8fc04aed2d8481106f72d24ed44ddb5847823" exitCode=0 Jan 29 16:48:48 crc kubenswrapper[4886]: I0129 16:48:48.103858 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rb649" event={"ID":"fc1d1fd3-36c5-4b47-bd32-230dc4453e57","Type":"ContainerDied","Data":"cc46e50228c504a5ce69248aa0c8fc04aed2d8481106f72d24ed44ddb5847823"} Jan 29 16:48:49 crc kubenswrapper[4886]: I0129 16:48:49.112830 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rb649" event={"ID":"fc1d1fd3-36c5-4b47-bd32-230dc4453e57","Type":"ContainerStarted","Data":"624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928"} Jan 29 16:48:50 crc kubenswrapper[4886]: I0129 16:48:50.124602 4886 generic.go:334] "Generic (PLEG): container finished" podID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerID="624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928" exitCode=0 Jan 29 16:48:50 crc kubenswrapper[4886]: I0129 16:48:50.124658 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rb649" event={"ID":"fc1d1fd3-36c5-4b47-bd32-230dc4453e57","Type":"ContainerDied","Data":"624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928"} Jan 29 16:48:51 crc kubenswrapper[4886]: I0129 16:48:51.135254 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rb649" event={"ID":"fc1d1fd3-36c5-4b47-bd32-230dc4453e57","Type":"ContainerStarted","Data":"52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c"} Jan 29 16:48:51 crc kubenswrapper[4886]: I0129 16:48:51.157761 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rb649" podStartSLOduration=2.405310719 podStartE2EDuration="5.157739289s" podCreationTimestamp="2026-01-29 16:48:46 +0000 UTC" firstStartedPulling="2026-01-29 16:48:48.105643246 +0000 UTC m=+1611.014362568" lastFinishedPulling="2026-01-29 16:48:50.858071856 +0000 UTC m=+1613.766791138" observedRunningTime="2026-01-29 16:48:51.152785229 +0000 UTC m=+1614.061504501" watchObservedRunningTime="2026-01-29 16:48:51.157739289 +0000 UTC m=+1614.066458601" Jan 
29 16:48:56 crc kubenswrapper[4886]: I0129 16:48:56.549398 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:56 crc kubenswrapper[4886]: I0129 16:48:56.550494 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:48:57 crc kubenswrapper[4886]: I0129 16:48:57.601362 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rb649" podUID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerName="registry-server" probeResult="failure" output=< Jan 29 16:48:57 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 16:48:57 crc kubenswrapper[4886]: > Jan 29 16:48:57 crc kubenswrapper[4886]: I0129 16:48:57.615737 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:48:57 crc kubenswrapper[4886]: E0129 16:48:57.616156 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:48:59 crc kubenswrapper[4886]: I0129 16:48:59.297145 4886 scope.go:117] "RemoveContainer" containerID="5d883c5a30d8f4bbb039e6aaa651b8e09e6b2a8064244a25c33a761d3d8863ae" Jan 29 16:48:59 crc kubenswrapper[4886]: I0129 16:48:59.337564 4886 scope.go:117] "RemoveContainer" containerID="f97710e37d132101bc18cdd88c6b7f51c7d65099d23a9fcf1887c1bba9f84a3e" Jan 29 16:48:59 crc kubenswrapper[4886]: I0129 16:48:59.366178 4886 scope.go:117] "RemoveContainer" containerID="33b121937df6965f1e7c4b97eec963e1caa986d708bab7e6baf54e700c6b9a38" Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.789372 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b9w7q"] Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.793474 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.807005 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9w7q"] Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.826136 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-utilities\") pod \"community-operators-b9w7q\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.826217 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-catalog-content\") pod \"community-operators-b9w7q\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.826320 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cltjl\" (UniqueName: \"kubernetes.io/projected/c468bcf2-7186-4ef4-9770-70d4776e478d-kube-api-access-cltjl\") pod \"community-operators-b9w7q\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.928383 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cltjl\" (UniqueName: \"kubernetes.io/projected/c468bcf2-7186-4ef4-9770-70d4776e478d-kube-api-access-cltjl\") pod \"community-operators-b9w7q\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.928559 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-utilities\") pod \"community-operators-b9w7q\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.928606 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-catalog-content\") pod \"community-operators-b9w7q\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.929218 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-utilities\") pod \"community-operators-b9w7q\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.929296 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-catalog-content\") pod \"community-operators-b9w7q\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:01 crc kubenswrapper[4886]: I0129 16:49:01.967786 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cltjl\" (UniqueName: \"kubernetes.io/projected/c468bcf2-7186-4ef4-9770-70d4776e478d-kube-api-access-cltjl\") pod \"community-operators-b9w7q\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:02 crc kubenswrapper[4886]: I0129 16:49:02.126622 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:02 crc kubenswrapper[4886]: W0129 16:49:02.626129 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc468bcf2_7186_4ef4_9770_70d4776e478d.slice/crio-799625d5a7995ef45825f53d082e2fd90ee42fdf7e125df25160449117b36de2 WatchSource:0}: Error finding container 799625d5a7995ef45825f53d082e2fd90ee42fdf7e125df25160449117b36de2: Status 404 returned error can't find the container with id 799625d5a7995ef45825f53d082e2fd90ee42fdf7e125df25160449117b36de2 Jan 29 16:49:02 crc kubenswrapper[4886]: I0129 16:49:02.632203 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9w7q"] Jan 29 16:49:03 crc kubenswrapper[4886]: I0129 16:49:03.226356 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9w7q" event={"ID":"c468bcf2-7186-4ef4-9770-70d4776e478d","Type":"ContainerStarted","Data":"799625d5a7995ef45825f53d082e2fd90ee42fdf7e125df25160449117b36de2"} Jan 29 16:49:04 crc kubenswrapper[4886]: I0129 16:49:04.241407 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9w7q" event={"ID":"c468bcf2-7186-4ef4-9770-70d4776e478d","Type":"ContainerStarted","Data":"082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393"} Jan 29 16:49:05 crc kubenswrapper[4886]: I0129 16:49:05.257124 4886 generic.go:334] "Generic (PLEG): container finished" podID="c468bcf2-7186-4ef4-9770-70d4776e478d" containerID="082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393" exitCode=0 Jan 29 16:49:05 crc kubenswrapper[4886]: I0129 16:49:05.257206 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9w7q" event={"ID":"c468bcf2-7186-4ef4-9770-70d4776e478d","Type":"ContainerDied","Data":"082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393"} Jan 29 16:49:06 crc kubenswrapper[4886]: I0129 16:49:06.606244 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:49:06 crc kubenswrapper[4886]: I0129 16:49:06.677576 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:49:07 crc kubenswrapper[4886]: I0129 16:49:07.278475 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9w7q" event={"ID":"c468bcf2-7186-4ef4-9770-70d4776e478d","Type":"ContainerStarted","Data":"ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8"} Jan 29 16:49:07 crc kubenswrapper[4886]: I0129 16:49:07.726866 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rb649"] Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.291011 4886 generic.go:334] "Generic (PLEG): container finished" podID="c468bcf2-7186-4ef4-9770-70d4776e478d" containerID="ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8" 
exitCode=0 Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.291063 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9w7q" event={"ID":"c468bcf2-7186-4ef4-9770-70d4776e478d","Type":"ContainerDied","Data":"ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8"} Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.291386 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rb649" podUID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerName="registry-server" containerID="cri-o://52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c" gracePeriod=2 Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.824224 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.858606 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-utilities\") pod \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.858725 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-catalog-content\") pod \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.858828 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6jb6\" (UniqueName: \"kubernetes.io/projected/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-kube-api-access-t6jb6\") pod \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\" (UID: \"fc1d1fd3-36c5-4b47-bd32-230dc4453e57\") " Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.864830 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-utilities" (OuterVolumeSpecName: "utilities") pod "fc1d1fd3-36c5-4b47-bd32-230dc4453e57" (UID: "fc1d1fd3-36c5-4b47-bd32-230dc4453e57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.872926 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-kube-api-access-t6jb6" (OuterVolumeSpecName: "kube-api-access-t6jb6") pod "fc1d1fd3-36c5-4b47-bd32-230dc4453e57" (UID: "fc1d1fd3-36c5-4b47-bd32-230dc4453e57"). InnerVolumeSpecName "kube-api-access-t6jb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.960813 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6jb6\" (UniqueName: \"kubernetes.io/projected/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-kube-api-access-t6jb6\") on node \"crc\" DevicePath \"\"" Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.960863 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:49:08 crc kubenswrapper[4886]: I0129 16:49:08.997809 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc1d1fd3-36c5-4b47-bd32-230dc4453e57" (UID: "fc1d1fd3-36c5-4b47-bd32-230dc4453e57"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.061862 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc1d1fd3-36c5-4b47-bd32-230dc4453e57-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.305937 4886 generic.go:334] "Generic (PLEG): container finished" podID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerID="52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c" exitCode=0 Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.306055 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rb649" event={"ID":"fc1d1fd3-36c5-4b47-bd32-230dc4453e57","Type":"ContainerDied","Data":"52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c"} Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.306090 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rb649" event={"ID":"fc1d1fd3-36c5-4b47-bd32-230dc4453e57","Type":"ContainerDied","Data":"d857247c97994f556a4a4a300a7f0839fd8e562211f2e6ae427fe0ad1d0d3d48"} Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.306116 4886 scope.go:117] "RemoveContainer" containerID="52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.306993 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rb649" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.309427 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9w7q" event={"ID":"c468bcf2-7186-4ef4-9770-70d4776e478d","Type":"ContainerStarted","Data":"59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77"} Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.326529 4886 scope.go:117] "RemoveContainer" containerID="624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.339625 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b9w7q" podStartSLOduration=5.895748684 podStartE2EDuration="8.33959473s" podCreationTimestamp="2026-01-29 16:49:01 +0000 UTC" firstStartedPulling="2026-01-29 16:49:06.269700144 +0000 UTC m=+1629.178419456" lastFinishedPulling="2026-01-29 16:49:08.71354622 +0000 UTC m=+1631.622265502" observedRunningTime="2026-01-29 16:49:09.332673564 +0000 UTC m=+1632.241392886" watchObservedRunningTime="2026-01-29 16:49:09.33959473 +0000 UTC m=+1632.248314012" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.357900 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rb649"] Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.367652 4886 scope.go:117] "RemoveContainer" containerID="cc46e50228c504a5ce69248aa0c8fc04aed2d8481106f72d24ed44ddb5847823" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.372058 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rb649"] Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.389586 4886 scope.go:117] "RemoveContainer" containerID="52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c" Jan 29 16:49:09 crc kubenswrapper[4886]: E0129 16:49:09.389823 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c\": container with ID starting with 52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c not found: ID does not exist" containerID="52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.389877 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c"} err="failed to get container status \"52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c\": rpc error: code = NotFound desc = could not find container \"52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c\": container with ID starting with 52e7407b95f1e3d37b25c372e80a9917554036fb5d36e571babdf608c6ab8b2c not found: ID does not exist" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.389915 4886 scope.go:117] "RemoveContainer" containerID="624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928" Jan 29 16:49:09 crc kubenswrapper[4886]: E0129 16:49:09.390176 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928\": container with ID starting with 624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928 not found: ID does not exist" 
containerID="624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.390214 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928"} err="failed to get container status \"624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928\": rpc error: code = NotFound desc = could not find container \"624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928\": container with ID starting with 624f5139f1a2c50f96cd70304d37713a103819d2077781d21599f155b38e0928 not found: ID does not exist" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.390231 4886 scope.go:117] "RemoveContainer" containerID="cc46e50228c504a5ce69248aa0c8fc04aed2d8481106f72d24ed44ddb5847823" Jan 29 16:49:09 crc kubenswrapper[4886]: E0129 16:49:09.390594 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc46e50228c504a5ce69248aa0c8fc04aed2d8481106f72d24ed44ddb5847823\": container with ID starting with cc46e50228c504a5ce69248aa0c8fc04aed2d8481106f72d24ed44ddb5847823 not found: ID does not exist" containerID="cc46e50228c504a5ce69248aa0c8fc04aed2d8481106f72d24ed44ddb5847823" Jan 29 16:49:09 crc kubenswrapper[4886]: I0129 16:49:09.390634 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc46e50228c504a5ce69248aa0c8fc04aed2d8481106f72d24ed44ddb5847823"} err="failed to get container status \"cc46e50228c504a5ce69248aa0c8fc04aed2d8481106f72d24ed44ddb5847823\": rpc error: code = NotFound desc = could not find container \"cc46e50228c504a5ce69248aa0c8fc04aed2d8481106f72d24ed44ddb5847823\": container with ID starting with cc46e50228c504a5ce69248aa0c8fc04aed2d8481106f72d24ed44ddb5847823 not found: ID does not exist" Jan 29 16:49:10 crc kubenswrapper[4886]: I0129 16:49:10.632630 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" path="/var/lib/kubelet/pods/fc1d1fd3-36c5-4b47-bd32-230dc4453e57/volumes" Jan 29 16:49:11 crc kubenswrapper[4886]: I0129 16:49:11.616010 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:49:11 crc kubenswrapper[4886]: E0129 16:49:11.616844 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:49:12 crc kubenswrapper[4886]: I0129 16:49:12.127115 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:12 crc kubenswrapper[4886]: I0129 16:49:12.127652 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:12 crc kubenswrapper[4886]: I0129 16:49:12.201893 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:13 crc kubenswrapper[4886]: I0129 16:49:13.434610 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:13 crc kubenswrapper[4886]: I0129 16:49:13.718042 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9w7q"] Jan 29 16:49:15 crc kubenswrapper[4886]: I0129 16:49:15.377016 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b9w7q" podUID="c468bcf2-7186-4ef4-9770-70d4776e478d" containerName="registry-server" containerID="cri-o://59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77" gracePeriod=2 Jan 29 16:49:15 crc kubenswrapper[4886]: I0129 16:49:15.878473 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.005373 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-utilities\") pod \"c468bcf2-7186-4ef4-9770-70d4776e478d\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.005446 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cltjl\" (UniqueName: \"kubernetes.io/projected/c468bcf2-7186-4ef4-9770-70d4776e478d-kube-api-access-cltjl\") pod \"c468bcf2-7186-4ef4-9770-70d4776e478d\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.005530 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-catalog-content\") pod \"c468bcf2-7186-4ef4-9770-70d4776e478d\" (UID: \"c468bcf2-7186-4ef4-9770-70d4776e478d\") " Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.006672 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-utilities" (OuterVolumeSpecName: "utilities") pod "c468bcf2-7186-4ef4-9770-70d4776e478d" (UID: "c468bcf2-7186-4ef4-9770-70d4776e478d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.015564 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c468bcf2-7186-4ef4-9770-70d4776e478d-kube-api-access-cltjl" (OuterVolumeSpecName: "kube-api-access-cltjl") pod "c468bcf2-7186-4ef4-9770-70d4776e478d" (UID: "c468bcf2-7186-4ef4-9770-70d4776e478d"). InnerVolumeSpecName "kube-api-access-cltjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.105409 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c468bcf2-7186-4ef4-9770-70d4776e478d" (UID: "c468bcf2-7186-4ef4-9770-70d4776e478d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.106964 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.106996 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cltjl\" (UniqueName: \"kubernetes.io/projected/c468bcf2-7186-4ef4-9770-70d4776e478d-kube-api-access-cltjl\") on node \"crc\" DevicePath \"\"" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.107009 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c468bcf2-7186-4ef4-9770-70d4776e478d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.388093 4886 generic.go:334] "Generic (PLEG): container finished" podID="c468bcf2-7186-4ef4-9770-70d4776e478d" containerID="59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77" exitCode=0 Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.388136 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9w7q" event={"ID":"c468bcf2-7186-4ef4-9770-70d4776e478d","Type":"ContainerDied","Data":"59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77"} Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.388171 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9w7q" event={"ID":"c468bcf2-7186-4ef4-9770-70d4776e478d","Type":"ContainerDied","Data":"799625d5a7995ef45825f53d082e2fd90ee42fdf7e125df25160449117b36de2"} Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.388190 4886 scope.go:117] "RemoveContainer" containerID="59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.388186 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9w7q" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.413319 4886 scope.go:117] "RemoveContainer" containerID="ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.426423 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9w7q"] Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.431122 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b9w7q"] Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.458770 4886 scope.go:117] "RemoveContainer" containerID="082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.478902 4886 scope.go:117] "RemoveContainer" containerID="59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77" Jan 29 16:49:16 crc kubenswrapper[4886]: E0129 16:49:16.479264 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77\": container with ID starting with 59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77 not found: ID does not exist" containerID="59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.479308 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77"} err="failed to get container status \"59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77\": rpc error: code = NotFound desc = could not find container \"59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77\": container with ID starting with 59cc412193b3130b39141a3f157a2a8998aa61ddecddcf310dee6b51ec2ffe77 not found: ID does not exist" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.479377 4886 scope.go:117] "RemoveContainer" containerID="ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8" Jan 29 16:49:16 crc kubenswrapper[4886]: E0129 16:49:16.479681 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8\": container with ID starting with ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8 not found: ID does not exist" containerID="ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.479718 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8"} err="failed to get container status \"ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8\": rpc error: code = NotFound desc = could not find container \"ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8\": container with ID starting with ac6f611b16c6f5d7856add64806d578e7d1ff0562407cf21ec433ec91447a1e8 not found: ID does not exist" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.479783 4886 scope.go:117] "RemoveContainer" containerID="082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393" Jan 29 16:49:16 crc kubenswrapper[4886]: E0129 16:49:16.480021 4886 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393\": container with ID starting with 082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393 not found: ID does not exist" containerID="082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.480049 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393"} err="failed to get container status \"082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393\": rpc error: code = NotFound desc = could not find container \"082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393\": container with ID starting with 082c04bad5b5edb39719d59fe983024d447a170e7b3cb883e5e9c1dec4786393 not found: ID does not exist" Jan 29 16:49:16 crc kubenswrapper[4886]: I0129 16:49:16.625425 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c468bcf2-7186-4ef4-9770-70d4776e478d" path="/var/lib/kubelet/pods/c468bcf2-7186-4ef4-9770-70d4776e478d/volumes" Jan 29 16:49:23 crc kubenswrapper[4886]: I0129 16:49:23.615720 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:49:23 crc kubenswrapper[4886]: E0129 16:49:23.616315 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:49:37 crc kubenswrapper[4886]: I0129 16:49:37.614921 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:49:37 crc kubenswrapper[4886]: E0129 16:49:37.615957 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:49:52 crc kubenswrapper[4886]: I0129 16:49:52.615037 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:49:52 crc kubenswrapper[4886]: E0129 16:49:52.616537 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:49:59 crc kubenswrapper[4886]: I0129 16:49:59.429162 4886 scope.go:117] "RemoveContainer" containerID="c3183e31247098ddd97f7b27ad0dbf70d02daf691b6fbd6a4595181aba6a0ae9" Jan 29 16:49:59 crc kubenswrapper[4886]: I0129 16:49:59.464509 4886 scope.go:117] "RemoveContainer" 
containerID="0e60e37f19cf29954ac9598d39f3e907b0a8fd7df0f8e5321feafa568cea256e" Jan 29 16:49:59 crc kubenswrapper[4886]: I0129 16:49:59.496945 4886 scope.go:117] "RemoveContainer" containerID="82c9ec7fc7823b99a453ab6558f3f2d190f9fc013e02e7613db77aca6c9d421f" Jan 29 16:49:59 crc kubenswrapper[4886]: I0129 16:49:59.526589 4886 scope.go:117] "RemoveContainer" containerID="e4cccb4d486fe60f0edfb4f7f715ab8d92c12f9f9f4a1cfe4e00c4adc5c34b51" Jan 29 16:49:59 crc kubenswrapper[4886]: I0129 16:49:59.561114 4886 scope.go:117] "RemoveContainer" containerID="a37b6266b19c1ce3a441dff00e8cafa9669109c4ad6f2385f4502687f4af460a" Jan 29 16:49:59 crc kubenswrapper[4886]: I0129 16:49:59.581731 4886 scope.go:117] "RemoveContainer" containerID="8d122cad021ce2744d255a9dc7ff90dfde7fd82fdce7705c91c1c86d943ebbab" Jan 29 16:50:07 crc kubenswrapper[4886]: I0129 16:50:07.615875 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:50:07 crc kubenswrapper[4886]: E0129 16:50:07.617068 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:50:22 crc kubenswrapper[4886]: I0129 16:50:22.615813 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:50:22 crc kubenswrapper[4886]: E0129 16:50:22.616824 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:50:34 crc kubenswrapper[4886]: I0129 16:50:34.615595 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:50:34 crc kubenswrapper[4886]: E0129 16:50:34.616515 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:50:46 crc kubenswrapper[4886]: I0129 16:50:46.615707 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:50:46 crc kubenswrapper[4886]: E0129 16:50:46.616860 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:51:00 crc kubenswrapper[4886]: I0129 16:51:00.614842 4886 
scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:51:00 crc kubenswrapper[4886]: E0129 16:51:00.615647 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:51:12 crc kubenswrapper[4886]: I0129 16:51:12.615042 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:51:12 crc kubenswrapper[4886]: E0129 16:51:12.616044 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:51:23 crc kubenswrapper[4886]: I0129 16:51:23.615269 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:51:23 crc kubenswrapper[4886]: E0129 16:51:23.615996 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:51:34 crc kubenswrapper[4886]: I0129 16:51:34.616008 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:51:34 crc kubenswrapper[4886]: E0129 16:51:34.617018 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:51:45 crc kubenswrapper[4886]: I0129 16:51:45.615710 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:51:45 crc kubenswrapper[4886]: E0129 16:51:45.616699 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:51:58 crc kubenswrapper[4886]: I0129 16:51:58.621855 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:51:58 crc kubenswrapper[4886]: E0129 16:51:58.622921 4886 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:52:10 crc kubenswrapper[4886]: I0129 16:52:10.615819 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:52:10 crc kubenswrapper[4886]: E0129 16:52:10.616997 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:52:21 crc kubenswrapper[4886]: I0129 16:52:21.614897 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:52:21 crc kubenswrapper[4886]: E0129 16:52:21.615916 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:52:35 crc kubenswrapper[4886]: I0129 16:52:35.614918 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:52:35 crc kubenswrapper[4886]: E0129 16:52:35.616074 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:52:47 crc kubenswrapper[4886]: I0129 16:52:47.615745 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:52:47 crc kubenswrapper[4886]: E0129 16:52:47.616693 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:53:01 crc kubenswrapper[4886]: I0129 16:53:01.615689 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:53:01 crc kubenswrapper[4886]: E0129 16:53:01.616506 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:53:13 crc kubenswrapper[4886]: I0129 16:53:13.615600 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:53:13 crc kubenswrapper[4886]: E0129 16:53:13.616523 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:53:24 crc kubenswrapper[4886]: I0129 16:53:24.615768 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:53:24 crc kubenswrapper[4886]: E0129 16:53:24.616799 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 16:53:38 crc kubenswrapper[4886]: I0129 16:53:38.617904 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:53:39 crc kubenswrapper[4886]: I0129 16:53:39.659776 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"8ef97582eea2927ab131d16b422621b32afa666846864a223a782bc24fb0ddda"} Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.248213 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn"] Jan 29 16:54:44 crc kubenswrapper[4886]: E0129 16:54:44.249130 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c468bcf2-7186-4ef4-9770-70d4776e478d" containerName="extract-content" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.249147 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c468bcf2-7186-4ef4-9770-70d4776e478d" containerName="extract-content" Jan 29 16:54:44 crc kubenswrapper[4886]: E0129 16:54:44.249166 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerName="extract-utilities" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.249174 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerName="extract-utilities" Jan 29 16:54:44 crc kubenswrapper[4886]: E0129 16:54:44.249186 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerName="extract-content" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.249196 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerName="extract-content" Jan 29 16:54:44 crc 
kubenswrapper[4886]: E0129 16:54:44.249211 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c468bcf2-7186-4ef4-9770-70d4776e478d" containerName="extract-utilities" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.249219 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c468bcf2-7186-4ef4-9770-70d4776e478d" containerName="extract-utilities" Jan 29 16:54:44 crc kubenswrapper[4886]: E0129 16:54:44.249235 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerName="registry-server" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.249242 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerName="registry-server" Jan 29 16:54:44 crc kubenswrapper[4886]: E0129 16:54:44.249258 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c468bcf2-7186-4ef4-9770-70d4776e478d" containerName="registry-server" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.249266 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c468bcf2-7186-4ef4-9770-70d4776e478d" containerName="registry-server" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.249450 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1d1fd3-36c5-4b47-bd32-230dc4453e57" containerName="registry-server" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.249471 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c468bcf2-7186-4ef4-9770-70d4776e478d" containerName="registry-server" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.250912 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.254007 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.255428 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn"] Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.319770 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn\" (UID: \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.319845 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn\" (UID: \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.319933 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsrrs\" (UniqueName: \"kubernetes.io/projected/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-kube-api-access-hsrrs\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn\" (UID: 
\"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.421477 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn\" (UID: \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.421569 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsrrs\" (UniqueName: \"kubernetes.io/projected/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-kube-api-access-hsrrs\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn\" (UID: \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.421689 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn\" (UID: \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.422177 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn\" (UID: \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.422203 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn\" (UID: \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.452756 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsrrs\" (UniqueName: \"kubernetes.io/projected/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-kube-api-access-hsrrs\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn\" (UID: \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:44 crc kubenswrapper[4886]: I0129 16:54:44.569062 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:45 crc kubenswrapper[4886]: I0129 16:54:45.130376 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn"] Jan 29 16:54:45 crc kubenswrapper[4886]: I0129 16:54:45.224637 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" event={"ID":"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7","Type":"ContainerStarted","Data":"31a22a4610e4bc5ef385c72ff41fe37a167bc49f9af10fc97aab59455595fd80"} Jan 29 16:54:46 crc kubenswrapper[4886]: I0129 16:54:46.234293 4886 generic.go:334] "Generic (PLEG): container finished" podID="1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" containerID="f9304c3928205846747e1e5c7f125f756ac7129c49ff039ab595e9d33a42c1cc" exitCode=0 Jan 29 16:54:46 crc kubenswrapper[4886]: I0129 16:54:46.234397 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" event={"ID":"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7","Type":"ContainerDied","Data":"f9304c3928205846747e1e5c7f125f756ac7129c49ff039ab595e9d33a42c1cc"} Jan 29 16:54:46 crc kubenswrapper[4886]: I0129 16:54:46.236644 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 16:54:52 crc kubenswrapper[4886]: I0129 16:54:52.290135 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" event={"ID":"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7","Type":"ContainerStarted","Data":"b84010ce0c4043243efacf85a0fcfff301b23fe298fb04de818639758c93f7fb"} Jan 29 16:54:53 crc kubenswrapper[4886]: I0129 16:54:53.298765 4886 generic.go:334] "Generic (PLEG): container finished" podID="1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" containerID="b84010ce0c4043243efacf85a0fcfff301b23fe298fb04de818639758c93f7fb" exitCode=0 Jan 29 16:54:53 crc kubenswrapper[4886]: I0129 16:54:53.298834 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" event={"ID":"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7","Type":"ContainerDied","Data":"b84010ce0c4043243efacf85a0fcfff301b23fe298fb04de818639758c93f7fb"} Jan 29 16:54:54 crc kubenswrapper[4886]: I0129 16:54:54.317145 4886 generic.go:334] "Generic (PLEG): container finished" podID="1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" containerID="5f67d03a2e4c4b97cf1d3b14a5246c56dee6336601fe7d4e5a56a15ac76c14ea" exitCode=0 Jan 29 16:54:54 crc kubenswrapper[4886]: I0129 16:54:54.317757 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" event={"ID":"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7","Type":"ContainerDied","Data":"5f67d03a2e4c4b97cf1d3b14a5246c56dee6336601fe7d4e5a56a15ac76c14ea"} Jan 29 16:54:55 crc kubenswrapper[4886]: I0129 16:54:55.607900 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:54:55 crc kubenswrapper[4886]: I0129 16:54:55.743361 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-bundle\") pod \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\" (UID: \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " Jan 29 16:54:55 crc kubenswrapper[4886]: I0129 16:54:55.743433 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-util\") pod \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\" (UID: \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " Jan 29 16:54:55 crc kubenswrapper[4886]: I0129 16:54:55.743471 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsrrs\" (UniqueName: \"kubernetes.io/projected/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-kube-api-access-hsrrs\") pod \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\" (UID: \"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7\") " Jan 29 16:54:55 crc kubenswrapper[4886]: I0129 16:54:55.744226 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-bundle" (OuterVolumeSpecName: "bundle") pod "1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" (UID: "1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:54:55 crc kubenswrapper[4886]: I0129 16:54:55.749691 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-kube-api-access-hsrrs" (OuterVolumeSpecName: "kube-api-access-hsrrs") pod "1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" (UID: "1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7"). InnerVolumeSpecName "kube-api-access-hsrrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:54:55 crc kubenswrapper[4886]: I0129 16:54:55.756961 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-util" (OuterVolumeSpecName: "util") pod "1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" (UID: "1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:54:55 crc kubenswrapper[4886]: I0129 16:54:55.845936 4886 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 16:54:55 crc kubenswrapper[4886]: I0129 16:54:55.845983 4886 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-util\") on node \"crc\" DevicePath \"\"" Jan 29 16:54:55 crc kubenswrapper[4886]: I0129 16:54:55.845999 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsrrs\" (UniqueName: \"kubernetes.io/projected/1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7-kube-api-access-hsrrs\") on node \"crc\" DevicePath \"\"" Jan 29 16:54:56 crc kubenswrapper[4886]: I0129 16:54:56.339461 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" event={"ID":"1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7","Type":"ContainerDied","Data":"31a22a4610e4bc5ef385c72ff41fe37a167bc49f9af10fc97aab59455595fd80"} Jan 29 16:54:56 crc kubenswrapper[4886]: I0129 16:54:56.339841 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31a22a4610e4bc5ef385c72ff41fe37a167bc49f9af10fc97aab59455595fd80" Jan 29 16:54:56 crc kubenswrapper[4886]: I0129 16:54:56.339566 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn" Jan 29 16:55:00 crc kubenswrapper[4886]: I0129 16:55:00.868431 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-xn5zh"] Jan 29 16:55:00 crc kubenswrapper[4886]: E0129 16:55:00.869077 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" containerName="pull" Jan 29 16:55:00 crc kubenswrapper[4886]: I0129 16:55:00.869093 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" containerName="pull" Jan 29 16:55:00 crc kubenswrapper[4886]: E0129 16:55:00.869121 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" containerName="extract" Jan 29 16:55:00 crc kubenswrapper[4886]: I0129 16:55:00.869131 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" containerName="extract" Jan 29 16:55:00 crc kubenswrapper[4886]: E0129 16:55:00.869150 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" containerName="util" Jan 29 16:55:00 crc kubenswrapper[4886]: I0129 16:55:00.869158 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" containerName="util" Jan 29 16:55:00 crc kubenswrapper[4886]: I0129 16:55:00.869317 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7" containerName="extract" Jan 29 16:55:00 crc kubenswrapper[4886]: I0129 16:55:00.870042 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-xn5zh" Jan 29 16:55:00 crc kubenswrapper[4886]: I0129 16:55:00.874249 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 29 16:55:00 crc kubenswrapper[4886]: I0129 16:55:00.874688 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 29 16:55:00 crc kubenswrapper[4886]: I0129 16:55:00.875034 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-vhlcq" Jan 29 16:55:00 crc kubenswrapper[4886]: I0129 16:55:00.877958 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-xn5zh"] Jan 29 16:55:01 crc kubenswrapper[4886]: I0129 16:55:01.027674 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmbc9\" (UniqueName: \"kubernetes.io/projected/64313301-3779-4923-949f-b8de5c30b5bb-kube-api-access-zmbc9\") pod \"nmstate-operator-646758c888-xn5zh\" (UID: \"64313301-3779-4923-949f-b8de5c30b5bb\") " pod="openshift-nmstate/nmstate-operator-646758c888-xn5zh" Jan 29 16:55:01 crc kubenswrapper[4886]: I0129 16:55:01.129652 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmbc9\" (UniqueName: \"kubernetes.io/projected/64313301-3779-4923-949f-b8de5c30b5bb-kube-api-access-zmbc9\") pod \"nmstate-operator-646758c888-xn5zh\" (UID: \"64313301-3779-4923-949f-b8de5c30b5bb\") " pod="openshift-nmstate/nmstate-operator-646758c888-xn5zh" Jan 29 16:55:01 crc kubenswrapper[4886]: I0129 16:55:01.158474 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmbc9\" (UniqueName: \"kubernetes.io/projected/64313301-3779-4923-949f-b8de5c30b5bb-kube-api-access-zmbc9\") pod \"nmstate-operator-646758c888-xn5zh\" (UID: \"64313301-3779-4923-949f-b8de5c30b5bb\") " pod="openshift-nmstate/nmstate-operator-646758c888-xn5zh" Jan 29 16:55:01 crc kubenswrapper[4886]: I0129 16:55:01.186607 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-xn5zh" Jan 29 16:55:01 crc kubenswrapper[4886]: I0129 16:55:01.589493 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-xn5zh"] Jan 29 16:55:01 crc kubenswrapper[4886]: W0129 16:55:01.594089 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64313301_3779_4923_949f_b8de5c30b5bb.slice/crio-5e7235fed240676037b6e2c21d2766320f11b421eb1b437ae533a71b82db565e WatchSource:0}: Error finding container 5e7235fed240676037b6e2c21d2766320f11b421eb1b437ae533a71b82db565e: Status 404 returned error can't find the container with id 5e7235fed240676037b6e2c21d2766320f11b421eb1b437ae533a71b82db565e Jan 29 16:55:02 crc kubenswrapper[4886]: I0129 16:55:02.384474 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-xn5zh" event={"ID":"64313301-3779-4923-949f-b8de5c30b5bb","Type":"ContainerStarted","Data":"5e7235fed240676037b6e2c21d2766320f11b421eb1b437ae533a71b82db565e"} Jan 29 16:55:05 crc kubenswrapper[4886]: I0129 16:55:05.415901 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-xn5zh" event={"ID":"64313301-3779-4923-949f-b8de5c30b5bb","Type":"ContainerStarted","Data":"0c88d0777aef9b64a944eb8b10ddd89037fa93f3b62d43586509a0c2743e4d27"} Jan 29 16:55:05 crc kubenswrapper[4886]: I0129 16:55:05.454549 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-xn5zh" podStartSLOduration=2.759281174 podStartE2EDuration="5.454521s" podCreationTimestamp="2026-01-29 16:55:00 +0000 UTC" firstStartedPulling="2026-01-29 16:55:01.596229828 +0000 UTC m=+1984.504949090" lastFinishedPulling="2026-01-29 16:55:04.291469604 +0000 UTC m=+1987.200188916" observedRunningTime="2026-01-29 16:55:05.442800378 +0000 UTC m=+1988.351519690" watchObservedRunningTime="2026-01-29 16:55:05.454521 +0000 UTC m=+1988.363240302" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.438940 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-ntx9m"] Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.440735 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-ntx9m" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.449855 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp"] Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.450710 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.451180 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-vr2f5" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.452725 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.463601 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-ntx9m"] Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.482671 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp"] Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.494706 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-9lh4n"] Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.495669 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.526952 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdkg4\" (UniqueName: \"kubernetes.io/projected/c42903b0-c0d4-4c39-bed3-3c9d083e753d-kube-api-access-gdkg4\") pod \"nmstate-webhook-8474b5b9d8-mv5wp\" (UID: \"c42903b0-c0d4-4c39-bed3-3c9d083e753d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.527020 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c42903b0-c0d4-4c39-bed3-3c9d083e753d-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-mv5wp\" (UID: \"c42903b0-c0d4-4c39-bed3-3c9d083e753d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.527054 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wwx9\" (UniqueName: \"kubernetes.io/projected/515c481a-e563-41c3-b5ff-d5957faf5217-kube-api-access-4wwx9\") pod \"nmstate-metrics-54757c584b-ntx9m\" (UID: \"515c481a-e563-41c3-b5ff-d5957faf5217\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-ntx9m" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.586748 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4"] Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.587652 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.590045 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.590074 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.590045 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-cszgj" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.604123 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4"] Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.629154 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/848b9df5-c882-4017-b1ad-6ac496646a76-ovs-socket\") pod \"nmstate-handler-9lh4n\" (UID: \"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.629201 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdkg4\" (UniqueName: \"kubernetes.io/projected/c42903b0-c0d4-4c39-bed3-3c9d083e753d-kube-api-access-gdkg4\") pod \"nmstate-webhook-8474b5b9d8-mv5wp\" (UID: \"c42903b0-c0d4-4c39-bed3-3c9d083e753d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.629235 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c42903b0-c0d4-4c39-bed3-3c9d083e753d-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-mv5wp\" (UID: \"c42903b0-c0d4-4c39-bed3-3c9d083e753d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.629269 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wwx9\" (UniqueName: \"kubernetes.io/projected/515c481a-e563-41c3-b5ff-d5957faf5217-kube-api-access-4wwx9\") pod \"nmstate-metrics-54757c584b-ntx9m\" (UID: \"515c481a-e563-41c3-b5ff-d5957faf5217\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-ntx9m" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.629287 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/848b9df5-c882-4017-b1ad-6ac496646a76-nmstate-lock\") pod \"nmstate-handler-9lh4n\" (UID: \"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.629307 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v52p5\" (UniqueName: \"kubernetes.io/projected/848b9df5-c882-4017-b1ad-6ac496646a76-kube-api-access-v52p5\") pod \"nmstate-handler-9lh4n\" (UID: \"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.629342 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/848b9df5-c882-4017-b1ad-6ac496646a76-dbus-socket\") pod \"nmstate-handler-9lh4n\" (UID: 
\"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: E0129 16:55:06.629758 4886 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 29 16:55:06 crc kubenswrapper[4886]: E0129 16:55:06.629802 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c42903b0-c0d4-4c39-bed3-3c9d083e753d-tls-key-pair podName:c42903b0-c0d4-4c39-bed3-3c9d083e753d nodeName:}" failed. No retries permitted until 2026-01-29 16:55:07.129788239 +0000 UTC m=+1990.038507511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/c42903b0-c0d4-4c39-bed3-3c9d083e753d-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-mv5wp" (UID: "c42903b0-c0d4-4c39-bed3-3c9d083e753d") : secret "openshift-nmstate-webhook" not found Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.648764 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdkg4\" (UniqueName: \"kubernetes.io/projected/c42903b0-c0d4-4c39-bed3-3c9d083e753d-kube-api-access-gdkg4\") pod \"nmstate-webhook-8474b5b9d8-mv5wp\" (UID: \"c42903b0-c0d4-4c39-bed3-3c9d083e753d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.648788 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wwx9\" (UniqueName: \"kubernetes.io/projected/515c481a-e563-41c3-b5ff-d5957faf5217-kube-api-access-4wwx9\") pod \"nmstate-metrics-54757c584b-ntx9m\" (UID: \"515c481a-e563-41c3-b5ff-d5957faf5217\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-ntx9m" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.731211 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2814fca3-5ea5-4b77-aad5-0308881c88bb-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-d4tp4\" (UID: \"2814fca3-5ea5-4b77-aad5-0308881c88bb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.731299 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2814fca3-5ea5-4b77-aad5-0308881c88bb-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-d4tp4\" (UID: \"2814fca3-5ea5-4b77-aad5-0308881c88bb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.731364 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/848b9df5-c882-4017-b1ad-6ac496646a76-nmstate-lock\") pod \"nmstate-handler-9lh4n\" (UID: \"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.731388 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v52p5\" (UniqueName: \"kubernetes.io/projected/848b9df5-c882-4017-b1ad-6ac496646a76-kube-api-access-v52p5\") pod \"nmstate-handler-9lh4n\" (UID: \"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.731434 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/848b9df5-c882-4017-b1ad-6ac496646a76-dbus-socket\") pod \"nmstate-handler-9lh4n\" (UID: \"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.731464 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/848b9df5-c882-4017-b1ad-6ac496646a76-nmstate-lock\") pod \"nmstate-handler-9lh4n\" (UID: \"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.731796 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/848b9df5-c882-4017-b1ad-6ac496646a76-dbus-socket\") pod \"nmstate-handler-9lh4n\" (UID: \"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.731869 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbdz7\" (UniqueName: \"kubernetes.io/projected/2814fca3-5ea5-4b77-aad5-0308881c88bb-kube-api-access-hbdz7\") pod \"nmstate-console-plugin-7754f76f8b-d4tp4\" (UID: \"2814fca3-5ea5-4b77-aad5-0308881c88bb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.731940 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/848b9df5-c882-4017-b1ad-6ac496646a76-ovs-socket\") pod \"nmstate-handler-9lh4n\" (UID: \"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.732035 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/848b9df5-c882-4017-b1ad-6ac496646a76-ovs-socket\") pod \"nmstate-handler-9lh4n\" (UID: \"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.755231 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v52p5\" (UniqueName: \"kubernetes.io/projected/848b9df5-c882-4017-b1ad-6ac496646a76-kube-api-access-v52p5\") pod \"nmstate-handler-9lh4n\" (UID: \"848b9df5-c882-4017-b1ad-6ac496646a76\") " pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.766414 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-ntx9m" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.779093 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d44f9f6d-wvkcd"] Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.780076 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.796279 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d44f9f6d-wvkcd"] Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.815077 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.848411 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbdz7\" (UniqueName: \"kubernetes.io/projected/2814fca3-5ea5-4b77-aad5-0308881c88bb-kube-api-access-hbdz7\") pod \"nmstate-console-plugin-7754f76f8b-d4tp4\" (UID: \"2814fca3-5ea5-4b77-aad5-0308881c88bb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.848520 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2814fca3-5ea5-4b77-aad5-0308881c88bb-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-d4tp4\" (UID: \"2814fca3-5ea5-4b77-aad5-0308881c88bb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.848585 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2814fca3-5ea5-4b77-aad5-0308881c88bb-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-d4tp4\" (UID: \"2814fca3-5ea5-4b77-aad5-0308881c88bb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:06 crc kubenswrapper[4886]: E0129 16:55:06.848823 4886 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Jan 29 16:55:06 crc kubenswrapper[4886]: E0129 16:55:06.848881 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2814fca3-5ea5-4b77-aad5-0308881c88bb-plugin-serving-cert podName:2814fca3-5ea5-4b77-aad5-0308881c88bb nodeName:}" failed. No retries permitted until 2026-01-29 16:55:07.348860193 +0000 UTC m=+1990.257579465 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/2814fca3-5ea5-4b77-aad5-0308881c88bb-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-d4tp4" (UID: "2814fca3-5ea5-4b77-aad5-0308881c88bb") : secret "plugin-serving-cert" not found Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.850273 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2814fca3-5ea5-4b77-aad5-0308881c88bb-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-d4tp4\" (UID: \"2814fca3-5ea5-4b77-aad5-0308881c88bb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:06 crc kubenswrapper[4886]: W0129 16:55:06.850817 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod848b9df5_c882_4017_b1ad_6ac496646a76.slice/crio-f2c5b96b2c7501c982fdee9d1d0fabac6399f29a42f0cea1334609a6d68f31b8 WatchSource:0}: Error finding container f2c5b96b2c7501c982fdee9d1d0fabac6399f29a42f0cea1334609a6d68f31b8: Status 404 returned error can't find the container with id f2c5b96b2c7501c982fdee9d1d0fabac6399f29a42f0cea1334609a6d68f31b8 Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.871457 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbdz7\" (UniqueName: \"kubernetes.io/projected/2814fca3-5ea5-4b77-aad5-0308881c88bb-kube-api-access-hbdz7\") pod \"nmstate-console-plugin-7754f76f8b-d4tp4\" (UID: \"2814fca3-5ea5-4b77-aad5-0308881c88bb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.951804 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-oauth-config\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.951882 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt776\" (UniqueName: \"kubernetes.io/projected/d7eb0acf-dfc4-4c24-8231-bfae5b620653-kube-api-access-vt776\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.951958 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-serving-cert\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.951976 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-oauth-serving-cert\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.952726 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-config\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.952777 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-trusted-ca-bundle\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:06 crc kubenswrapper[4886]: I0129 16:55:06.952825 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-service-ca\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.053823 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-serving-cert\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.053869 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-oauth-serving-cert\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.053896 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-config\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.053943 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-trusted-ca-bundle\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.053974 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-service-ca\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.054020 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-oauth-config\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.054050 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt776\" (UniqueName: 
\"kubernetes.io/projected/d7eb0acf-dfc4-4c24-8231-bfae5b620653-kube-api-access-vt776\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.054945 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-oauth-serving-cert\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.055025 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-config\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.055279 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-service-ca\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.055289 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-trusted-ca-bundle\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.059129 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-serving-cert\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.059791 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-oauth-config\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.072076 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt776\" (UniqueName: \"kubernetes.io/projected/d7eb0acf-dfc4-4c24-8231-bfae5b620653-kube-api-access-vt776\") pod \"console-7d44f9f6d-wvkcd\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.155415 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c42903b0-c0d4-4c39-bed3-3c9d083e753d-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-mv5wp\" (UID: \"c42903b0-c0d4-4c39-bed3-3c9d083e753d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.161955 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c42903b0-c0d4-4c39-bed3-3c9d083e753d-tls-key-pair\") pod 
\"nmstate-webhook-8474b5b9d8-mv5wp\" (UID: \"c42903b0-c0d4-4c39-bed3-3c9d083e753d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.171002 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.259230 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-ntx9m"] Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.357975 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2814fca3-5ea5-4b77-aad5-0308881c88bb-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-d4tp4\" (UID: \"2814fca3-5ea5-4b77-aad5-0308881c88bb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.363935 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2814fca3-5ea5-4b77-aad5-0308881c88bb-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-d4tp4\" (UID: \"2814fca3-5ea5-4b77-aad5-0308881c88bb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.376790 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.431539 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-ntx9m" event={"ID":"515c481a-e563-41c3-b5ff-d5957faf5217","Type":"ContainerStarted","Data":"e5c35961f61b0ca142eff5912053441fa3277d22ff63b267234ff963c21cb123"} Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.432528 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9lh4n" event={"ID":"848b9df5-c882-4017-b1ad-6ac496646a76","Type":"ContainerStarted","Data":"f2c5b96b2c7501c982fdee9d1d0fabac6399f29a42f0cea1334609a6d68f31b8"} Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.504977 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.616908 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d44f9f6d-wvkcd"] Jan 29 16:55:07 crc kubenswrapper[4886]: W0129 16:55:07.636080 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7eb0acf_dfc4_4c24_8231_bfae5b620653.slice/crio-2dde3f8777f56361bbc961c320b3499545e524fdb56d2e7e1762b3c549f1e8ca WatchSource:0}: Error finding container 2dde3f8777f56361bbc961c320b3499545e524fdb56d2e7e1762b3c549f1e8ca: Status 404 returned error can't find the container with id 2dde3f8777f56361bbc961c320b3499545e524fdb56d2e7e1762b3c549f1e8ca Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.781952 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp"] Jan 29 16:55:07 crc kubenswrapper[4886]: W0129 16:55:07.789926 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc42903b0_c0d4_4c39_bed3_3c9d083e753d.slice/crio-0e30b42164a13454d5d21ae69e07d6266d571f80251f6d10abf5e6f5aebabbb6 WatchSource:0}: Error finding container 0e30b42164a13454d5d21ae69e07d6266d571f80251f6d10abf5e6f5aebabbb6: Status 404 returned error can't find the container with id 0e30b42164a13454d5d21ae69e07d6266d571f80251f6d10abf5e6f5aebabbb6 Jan 29 16:55:07 crc kubenswrapper[4886]: I0129 16:55:07.997177 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4"] Jan 29 16:55:07 crc kubenswrapper[4886]: W0129 16:55:07.999847 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2814fca3_5ea5_4b77_aad5_0308881c88bb.slice/crio-049994e5706296f07a69bbc94ba72048dae5fb4de71dcf90e17e5808b2460a14 WatchSource:0}: Error finding container 049994e5706296f07a69bbc94ba72048dae5fb4de71dcf90e17e5808b2460a14: Status 404 returned error can't find the container with id 049994e5706296f07a69bbc94ba72048dae5fb4de71dcf90e17e5808b2460a14 Jan 29 16:55:08 crc kubenswrapper[4886]: I0129 16:55:08.444133 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" event={"ID":"c42903b0-c0d4-4c39-bed3-3c9d083e753d","Type":"ContainerStarted","Data":"0e30b42164a13454d5d21ae69e07d6266d571f80251f6d10abf5e6f5aebabbb6"} Jan 29 16:55:08 crc kubenswrapper[4886]: I0129 16:55:08.446560 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d44f9f6d-wvkcd" event={"ID":"d7eb0acf-dfc4-4c24-8231-bfae5b620653","Type":"ContainerStarted","Data":"83d754bde6259c4ef4756a1b0a86efc202f6d81cccfa70e563b1ad9cae41b68f"} Jan 29 16:55:08 crc kubenswrapper[4886]: I0129 16:55:08.446659 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d44f9f6d-wvkcd" event={"ID":"d7eb0acf-dfc4-4c24-8231-bfae5b620653","Type":"ContainerStarted","Data":"2dde3f8777f56361bbc961c320b3499545e524fdb56d2e7e1762b3c549f1e8ca"} Jan 29 16:55:08 crc kubenswrapper[4886]: I0129 16:55:08.450060 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" event={"ID":"2814fca3-5ea5-4b77-aad5-0308881c88bb","Type":"ContainerStarted","Data":"049994e5706296f07a69bbc94ba72048dae5fb4de71dcf90e17e5808b2460a14"} Jan 29 16:55:08 crc 
kubenswrapper[4886]: I0129 16:55:08.479402 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d44f9f6d-wvkcd" podStartSLOduration=2.47937693 podStartE2EDuration="2.47937693s" podCreationTimestamp="2026-01-29 16:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:55:08.474055834 +0000 UTC m=+1991.382775126" watchObservedRunningTime="2026-01-29 16:55:08.47937693 +0000 UTC m=+1991.388096232" Jan 29 16:55:10 crc kubenswrapper[4886]: I0129 16:55:10.480945 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" event={"ID":"c42903b0-c0d4-4c39-bed3-3c9d083e753d","Type":"ContainerStarted","Data":"dd9b27566a3e9b114a8a1aa3238466a492e42e9371d79541dc76fd2dc3448c5b"} Jan 29 16:55:10 crc kubenswrapper[4886]: I0129 16:55:10.481703 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" Jan 29 16:55:10 crc kubenswrapper[4886]: I0129 16:55:10.483397 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-ntx9m" event={"ID":"515c481a-e563-41c3-b5ff-d5957faf5217","Type":"ContainerStarted","Data":"c4a932209da16152e09d8640c43a6fdc4ec5c4b4650ffd8b919c9dffacd5926c"} Jan 29 16:55:10 crc kubenswrapper[4886]: I0129 16:55:10.485737 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9lh4n" event={"ID":"848b9df5-c882-4017-b1ad-6ac496646a76","Type":"ContainerStarted","Data":"975c6cd9ca3f769059b929b0357a188bbb30200e72ab2d272a5f623c49997894"} Jan 29 16:55:10 crc kubenswrapper[4886]: I0129 16:55:10.485946 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:10 crc kubenswrapper[4886]: I0129 16:55:10.506023 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" podStartSLOduration=2.293318841 podStartE2EDuration="4.506008612s" podCreationTimestamp="2026-01-29 16:55:06 +0000 UTC" firstStartedPulling="2026-01-29 16:55:07.791626371 +0000 UTC m=+1990.700345663" lastFinishedPulling="2026-01-29 16:55:10.004316162 +0000 UTC m=+1992.913035434" observedRunningTime="2026-01-29 16:55:10.504678365 +0000 UTC m=+1993.413397637" watchObservedRunningTime="2026-01-29 16:55:10.506008612 +0000 UTC m=+1993.414727884" Jan 29 16:55:10 crc kubenswrapper[4886]: I0129 16:55:10.526040 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-9lh4n" podStartSLOduration=1.446406159 podStartE2EDuration="4.52602132s" podCreationTimestamp="2026-01-29 16:55:06 +0000 UTC" firstStartedPulling="2026-01-29 16:55:06.879862192 +0000 UTC m=+1989.788581464" lastFinishedPulling="2026-01-29 16:55:09.959477293 +0000 UTC m=+1992.868196625" observedRunningTime="2026-01-29 16:55:10.520850049 +0000 UTC m=+1993.429569341" watchObservedRunningTime="2026-01-29 16:55:10.52602132 +0000 UTC m=+1993.434740592" Jan 29 16:55:11 crc kubenswrapper[4886]: I0129 16:55:11.495348 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" event={"ID":"2814fca3-5ea5-4b77-aad5-0308881c88bb","Type":"ContainerStarted","Data":"485dc32f331852b42eca3bac4a6fb624e25cbce299256c1ef555e1e33c7a90d4"} Jan 29 16:55:11 crc kubenswrapper[4886]: I0129 16:55:11.514677 4886 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4tp4" podStartSLOduration=2.359282699 podStartE2EDuration="5.514639545s" podCreationTimestamp="2026-01-29 16:55:06 +0000 UTC" firstStartedPulling="2026-01-29 16:55:08.001836512 +0000 UTC m=+1990.910555784" lastFinishedPulling="2026-01-29 16:55:11.157193338 +0000 UTC m=+1994.065912630" observedRunningTime="2026-01-29 16:55:11.51410197 +0000 UTC m=+1994.422821252" watchObservedRunningTime="2026-01-29 16:55:11.514639545 +0000 UTC m=+1994.423358817" Jan 29 16:55:13 crc kubenswrapper[4886]: I0129 16:55:13.513221 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-ntx9m" event={"ID":"515c481a-e563-41c3-b5ff-d5957faf5217","Type":"ContainerStarted","Data":"e2f541822966161e051c7b85f7a4d92b179228cca604f78b7d8a1fa10421b2ef"} Jan 29 16:55:13 crc kubenswrapper[4886]: I0129 16:55:13.540657 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-ntx9m" podStartSLOduration=2.487780318 podStartE2EDuration="7.540626239s" podCreationTimestamp="2026-01-29 16:55:06 +0000 UTC" firstStartedPulling="2026-01-29 16:55:07.274137428 +0000 UTC m=+1990.182856700" lastFinishedPulling="2026-01-29 16:55:12.326983339 +0000 UTC m=+1995.235702621" observedRunningTime="2026-01-29 16:55:13.534571783 +0000 UTC m=+1996.443291095" watchObservedRunningTime="2026-01-29 16:55:13.540626239 +0000 UTC m=+1996.449345541" Jan 29 16:55:16 crc kubenswrapper[4886]: I0129 16:55:16.861079 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-9lh4n" Jan 29 16:55:17 crc kubenswrapper[4886]: I0129 16:55:17.172135 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:17 crc kubenswrapper[4886]: I0129 16:55:17.172221 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:17 crc kubenswrapper[4886]: I0129 16:55:17.177569 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:17 crc kubenswrapper[4886]: I0129 16:55:17.556269 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 16:55:17 crc kubenswrapper[4886]: I0129 16:55:17.637981 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-664586d6fb-g55cf"] Jan 29 16:55:27 crc kubenswrapper[4886]: I0129 16:55:27.383630 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mv5wp" Jan 29 16:55:42 crc kubenswrapper[4886]: I0129 16:55:42.695543 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-664586d6fb-g55cf" podUID="42357e7c-de03-4b8b-80f5-f946411c67f7" containerName="console" containerID="cri-o://6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38" gracePeriod=15 Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.102667 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-664586d6fb-g55cf_42357e7c-de03-4b8b-80f5-f946411c67f7/console/0.log" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.103054 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.182434 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-oauth-config\") pod \"42357e7c-de03-4b8b-80f5-f946411c67f7\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.182553 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-console-config\") pod \"42357e7c-de03-4b8b-80f5-f946411c67f7\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.182599 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-trusted-ca-bundle\") pod \"42357e7c-de03-4b8b-80f5-f946411c67f7\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.182671 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-service-ca\") pod \"42357e7c-de03-4b8b-80f5-f946411c67f7\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.182706 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-oauth-serving-cert\") pod \"42357e7c-de03-4b8b-80f5-f946411c67f7\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.182788 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln452\" (UniqueName: \"kubernetes.io/projected/42357e7c-de03-4b8b-80f5-f946411c67f7-kube-api-access-ln452\") pod \"42357e7c-de03-4b8b-80f5-f946411c67f7\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.182898 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-serving-cert\") pod \"42357e7c-de03-4b8b-80f5-f946411c67f7\" (UID: \"42357e7c-de03-4b8b-80f5-f946411c67f7\") " Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.183617 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "42357e7c-de03-4b8b-80f5-f946411c67f7" (UID: "42357e7c-de03-4b8b-80f5-f946411c67f7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.183630 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-service-ca" (OuterVolumeSpecName: "service-ca") pod "42357e7c-de03-4b8b-80f5-f946411c67f7" (UID: "42357e7c-de03-4b8b-80f5-f946411c67f7"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.183644 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "42357e7c-de03-4b8b-80f5-f946411c67f7" (UID: "42357e7c-de03-4b8b-80f5-f946411c67f7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.183703 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-console-config" (OuterVolumeSpecName: "console-config") pod "42357e7c-de03-4b8b-80f5-f946411c67f7" (UID: "42357e7c-de03-4b8b-80f5-f946411c67f7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.184302 4886 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-console-config\") on node \"crc\" DevicePath \"\"" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.184335 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.184344 4886 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.184352 4886 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42357e7c-de03-4b8b-80f5-f946411c67f7-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.188890 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42357e7c-de03-4b8b-80f5-f946411c67f7-kube-api-access-ln452" (OuterVolumeSpecName: "kube-api-access-ln452") pod "42357e7c-de03-4b8b-80f5-f946411c67f7" (UID: "42357e7c-de03-4b8b-80f5-f946411c67f7"). InnerVolumeSpecName "kube-api-access-ln452". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.189459 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "42357e7c-de03-4b8b-80f5-f946411c67f7" (UID: "42357e7c-de03-4b8b-80f5-f946411c67f7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.190146 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "42357e7c-de03-4b8b-80f5-f946411c67f7" (UID: "42357e7c-de03-4b8b-80f5-f946411c67f7"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.286369 4886 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.286724 4886 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42357e7c-de03-4b8b-80f5-f946411c67f7-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.286736 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln452\" (UniqueName: \"kubernetes.io/projected/42357e7c-de03-4b8b-80f5-f946411c67f7-kube-api-access-ln452\") on node \"crc\" DevicePath \"\"" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.781785 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-664586d6fb-g55cf_42357e7c-de03-4b8b-80f5-f946411c67f7/console/0.log" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.781834 4886 generic.go:334] "Generic (PLEG): container finished" podID="42357e7c-de03-4b8b-80f5-f946411c67f7" containerID="6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38" exitCode=2 Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.781871 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-664586d6fb-g55cf" event={"ID":"42357e7c-de03-4b8b-80f5-f946411c67f7","Type":"ContainerDied","Data":"6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38"} Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.781903 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-664586d6fb-g55cf" event={"ID":"42357e7c-de03-4b8b-80f5-f946411c67f7","Type":"ContainerDied","Data":"4c6fe087595c24e70608f508c9599d4ead9e60d5c503746f12585384b13bc295"} Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.781925 4886 scope.go:117] "RemoveContainer" containerID="6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.782039 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-664586d6fb-g55cf" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.810914 4886 scope.go:117] "RemoveContainer" containerID="6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38" Jan 29 16:55:43 crc kubenswrapper[4886]: E0129 16:55:43.811293 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38\": container with ID starting with 6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38 not found: ID does not exist" containerID="6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.811355 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38"} err="failed to get container status \"6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38\": rpc error: code = NotFound desc = could not find container \"6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38\": container with ID starting with 6019dfcf6dda95ddc80718ca451b48d8dede9d785bf016b5b0c27dcf7bc93e38 not found: ID does not exist" Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.815696 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-664586d6fb-g55cf"] Jan 29 16:55:43 crc kubenswrapper[4886]: I0129 16:55:43.821615 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-664586d6fb-g55cf"] Jan 29 16:55:44 crc kubenswrapper[4886]: I0129 16:55:44.624965 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42357e7c-de03-4b8b-80f5-f946411c67f7" path="/var/lib/kubelet/pods/42357e7c-de03-4b8b-80f5-f946411c67f7/volumes" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.241288 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5"] Jan 29 16:55:58 crc kubenswrapper[4886]: E0129 16:55:58.242217 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42357e7c-de03-4b8b-80f5-f946411c67f7" containerName="console" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.242237 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="42357e7c-de03-4b8b-80f5-f946411c67f7" containerName="console" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.242499 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="42357e7c-de03-4b8b-80f5-f946411c67f7" containerName="console" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.244144 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.256675 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.258850 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5"] Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.359235 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.359540 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8dq4\" (UniqueName: \"kubernetes.io/projected/aa613edd-15e0-466f-8739-ab30f6d61801-kube-api-access-z8dq4\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.359699 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.461617 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.461734 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8dq4\" (UniqueName: \"kubernetes.io/projected/aa613edd-15e0-466f-8739-ab30f6d61801-kube-api-access-z8dq4\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.461801 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.462266 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.462274 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.485709 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8dq4\" (UniqueName: \"kubernetes.io/projected/aa613edd-15e0-466f-8739-ab30f6d61801-kube-api-access-z8dq4\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.588552 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 29 16:55:58 crc kubenswrapper[4886]: I0129 16:55:58.596534 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:55:59 crc kubenswrapper[4886]: I0129 16:55:59.066291 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5"] Jan 29 16:55:59 crc kubenswrapper[4886]: W0129 16:55:59.071934 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa613edd_15e0_466f_8739_ab30f6d61801.slice/crio-8be2bbeba35f6a3a828f1cf712135895f74da1f19506f9a662f89c6ac9ba1865 WatchSource:0}: Error finding container 8be2bbeba35f6a3a828f1cf712135895f74da1f19506f9a662f89c6ac9ba1865: Status 404 returned error can't find the container with id 8be2bbeba35f6a3a828f1cf712135895f74da1f19506f9a662f89c6ac9ba1865 Jan 29 16:55:59 crc kubenswrapper[4886]: I0129 16:55:59.661211 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:55:59 crc kubenswrapper[4886]: I0129 16:55:59.661596 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:55:59 crc kubenswrapper[4886]: I0129 16:55:59.934793 4886 generic.go:334] "Generic (PLEG): container finished" podID="aa613edd-15e0-466f-8739-ab30f6d61801" containerID="b3ba887bb48636a071a891e42be18b55f6a9e2fbc6239ddf3528ab05267a3a5f" exitCode=0 Jan 29 16:55:59 crc kubenswrapper[4886]: I0129 16:55:59.934842 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" event={"ID":"aa613edd-15e0-466f-8739-ab30f6d61801","Type":"ContainerDied","Data":"b3ba887bb48636a071a891e42be18b55f6a9e2fbc6239ddf3528ab05267a3a5f"} Jan 29 16:55:59 crc kubenswrapper[4886]: I0129 16:55:59.934869 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" event={"ID":"aa613edd-15e0-466f-8739-ab30f6d61801","Type":"ContainerStarted","Data":"8be2bbeba35f6a3a828f1cf712135895f74da1f19506f9a662f89c6ac9ba1865"} Jan 29 16:56:00 crc kubenswrapper[4886]: E0129 16:56:00.069340 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87" Jan 29 16:56:00 crc kubenswrapper[4886]: E0129 16:56:00.069510 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z8dq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5_openshift-marketplace(aa613edd-15e0-466f-8739-ab30f6d61801): ErrImagePull: initializing source docker://registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:56:00 crc kubenswrapper[4886]: E0129 16:56:00.070711 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"initializing source docker://registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87: 
Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" Jan 29 16:56:00 crc kubenswrapper[4886]: E0129 16:56:00.941948 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87\\\"\"" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" Jan 29 16:56:15 crc kubenswrapper[4886]: E0129 16:56:15.763649 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87" Jan 29 16:56:15 crc kubenswrapper[4886]: E0129 16:56:15.764367 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z8dq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5_openshift-marketplace(aa613edd-15e0-466f-8739-ab30f6d61801): ErrImagePull: initializing source docker://registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:56:15 crc kubenswrapper[4886]: E0129 16:56:15.765673 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"initializing source 
docker://registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" Jan 29 16:56:21 crc kubenswrapper[4886]: I0129 16:56:21.797460 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m4fv5"] Jan 29 16:56:21 crc kubenswrapper[4886]: I0129 16:56:21.805622 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:56:21 crc kubenswrapper[4886]: I0129 16:56:21.809123 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4fv5"] Jan 29 16:56:21 crc kubenswrapper[4886]: I0129 16:56:21.991585 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kfqs\" (UniqueName: \"kubernetes.io/projected/3e333f39-f93b-4066-8e9f-4bd27e4d3672-kube-api-access-6kfqs\") pod \"redhat-marketplace-m4fv5\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:56:21 crc kubenswrapper[4886]: I0129 16:56:21.991732 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-catalog-content\") pod \"redhat-marketplace-m4fv5\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:56:21 crc kubenswrapper[4886]: I0129 16:56:21.991918 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-utilities\") pod \"redhat-marketplace-m4fv5\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:56:22 crc kubenswrapper[4886]: I0129 16:56:22.095150 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-utilities\") pod \"redhat-marketplace-m4fv5\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:56:22 crc kubenswrapper[4886]: I0129 16:56:22.095393 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kfqs\" (UniqueName: \"kubernetes.io/projected/3e333f39-f93b-4066-8e9f-4bd27e4d3672-kube-api-access-6kfqs\") pod \"redhat-marketplace-m4fv5\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:56:22 crc kubenswrapper[4886]: I0129 16:56:22.095479 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-catalog-content\") pod \"redhat-marketplace-m4fv5\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:56:22 crc kubenswrapper[4886]: I0129 16:56:22.095776 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-utilities\") 
pod \"redhat-marketplace-m4fv5\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:56:22 crc kubenswrapper[4886]: I0129 16:56:22.095997 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-catalog-content\") pod \"redhat-marketplace-m4fv5\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:56:22 crc kubenswrapper[4886]: I0129 16:56:22.119351 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kfqs\" (UniqueName: \"kubernetes.io/projected/3e333f39-f93b-4066-8e9f-4bd27e4d3672-kube-api-access-6kfqs\") pod \"redhat-marketplace-m4fv5\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:56:22 crc kubenswrapper[4886]: I0129 16:56:22.129275 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:56:22 crc kubenswrapper[4886]: I0129 16:56:22.531490 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4fv5"] Jan 29 16:56:23 crc kubenswrapper[4886]: I0129 16:56:23.119885 4886 generic.go:334] "Generic (PLEG): container finished" podID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" containerID="54c413f049295c75ea245b7bf5b81932f10621e4a5575c34da54c41a85be6026" exitCode=0 Jan 29 16:56:23 crc kubenswrapper[4886]: I0129 16:56:23.119934 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4fv5" event={"ID":"3e333f39-f93b-4066-8e9f-4bd27e4d3672","Type":"ContainerDied","Data":"54c413f049295c75ea245b7bf5b81932f10621e4a5575c34da54c41a85be6026"} Jan 29 16:56:23 crc kubenswrapper[4886]: I0129 16:56:23.120253 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4fv5" event={"ID":"3e333f39-f93b-4066-8e9f-4bd27e4d3672","Type":"ContainerStarted","Data":"721f687c812954ac213bf098f41dc7b5630da2bcf0b09ba3c2bdd27881939e63"} Jan 29 16:56:23 crc kubenswrapper[4886]: E0129 16:56:23.253657 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 16:56:23 crc kubenswrapper[4886]: E0129 16:56:23.253844 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kfqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-m4fv5_openshift-marketplace(3e333f39-f93b-4066-8e9f-4bd27e4d3672): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:56:23 crc kubenswrapper[4886]: E0129 16:56:23.255063 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:56:24 crc kubenswrapper[4886]: E0129 16:56:24.132013 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:56:29 crc kubenswrapper[4886]: E0129 16:56:29.617881 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87\\\"\"" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" Jan 29 16:56:29 crc kubenswrapper[4886]: I0129 16:56:29.660867 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:56:29 crc kubenswrapper[4886]: I0129 16:56:29.660924 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:56:35 crc kubenswrapper[4886]: E0129 16:56:35.747924 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 16:56:35 crc kubenswrapper[4886]: E0129 16:56:35.749147 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kfqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-m4fv5_openshift-marketplace(3e333f39-f93b-4066-8e9f-4bd27e4d3672): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:56:35 crc kubenswrapper[4886]: E0129 16:56:35.750514 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:56:44 crc kubenswrapper[4886]: E0129 16:56:44.746030 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87" Jan 29 16:56:44 crc kubenswrapper[4886]: 
E0129 16:56:44.746667 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z8dq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5_openshift-marketplace(aa613edd-15e0-466f-8739-ab30f6d61801): ErrImagePull: initializing source docker://registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:56:44 crc kubenswrapper[4886]: E0129 16:56:44.747863 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"initializing source docker://registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" Jan 29 16:56:47 crc kubenswrapper[4886]: E0129 16:56:47.616897 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:56:58 crc kubenswrapper[4886]: E0129 16:56:58.619880 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87\\\"\"" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" Jan 29 16:56:59 crc kubenswrapper[4886]: I0129 16:56:59.661179 4886 
patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:56:59 crc kubenswrapper[4886]: I0129 16:56:59.662077 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:56:59 crc kubenswrapper[4886]: I0129 16:56:59.662214 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 16:56:59 crc kubenswrapper[4886]: I0129 16:56:59.662938 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ef97582eea2927ab131d16b422621b32afa666846864a223a782bc24fb0ddda"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 16:56:59 crc kubenswrapper[4886]: I0129 16:56:59.663122 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://8ef97582eea2927ab131d16b422621b32afa666846864a223a782bc24fb0ddda" gracePeriod=600 Jan 29 16:56:59 crc kubenswrapper[4886]: E0129 16:56:59.746848 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 16:56:59 crc kubenswrapper[4886]: E0129 16:56:59.747000 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kfqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-m4fv5_openshift-marketplace(3e333f39-f93b-4066-8e9f-4bd27e4d3672): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:56:59 crc kubenswrapper[4886]: E0129 16:56:59.748440 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:57:00 crc kubenswrapper[4886]: I0129 16:57:00.452581 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="8ef97582eea2927ab131d16b422621b32afa666846864a223a782bc24fb0ddda" exitCode=0 Jan 29 16:57:00 crc kubenswrapper[4886]: I0129 16:57:00.452664 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"8ef97582eea2927ab131d16b422621b32afa666846864a223a782bc24fb0ddda"} Jan 29 16:57:00 crc kubenswrapper[4886]: I0129 16:57:00.452711 4886 scope.go:117] "RemoveContainer" containerID="705ca471a878082d4a93a73d2095863766a13245174606f1f47cdefc4bd2e463" Jan 29 16:57:01 crc kubenswrapper[4886]: I0129 16:57:01.463934 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc"} Jan 29 16:57:11 crc kubenswrapper[4886]: E0129 16:57:11.619284 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87\\\"\"" 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" Jan 29 16:57:13 crc kubenswrapper[4886]: E0129 16:57:13.618708 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:57:23 crc kubenswrapper[4886]: E0129 16:57:23.617564 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:43205585b4bfcac18bfdf918280b62fe382a0d7926e6fdbea5edd703fa57cd87\\\"\"" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" Jan 29 16:57:26 crc kubenswrapper[4886]: E0129 16:57:26.617799 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:57:39 crc kubenswrapper[4886]: I0129 16:57:39.809724 4886 generic.go:334] "Generic (PLEG): container finished" podID="aa613edd-15e0-466f-8739-ab30f6d61801" containerID="ca5d820f84d33a6787485746a40ce0ca702d98726bdfd28f0b841d12759cdee5" exitCode=0 Jan 29 16:57:39 crc kubenswrapper[4886]: I0129 16:57:39.810012 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" event={"ID":"aa613edd-15e0-466f-8739-ab30f6d61801","Type":"ContainerDied","Data":"ca5d820f84d33a6787485746a40ce0ca702d98726bdfd28f0b841d12759cdee5"} Jan 29 16:57:40 crc kubenswrapper[4886]: I0129 16:57:40.833630 4886 generic.go:334] "Generic (PLEG): container finished" podID="aa613edd-15e0-466f-8739-ab30f6d61801" containerID="be763c4ea500c4509b35f741338737b9173afba1ba0428d16b5db6b158cc301f" exitCode=0 Jan 29 16:57:40 crc kubenswrapper[4886]: I0129 16:57:40.833747 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" event={"ID":"aa613edd-15e0-466f-8739-ab30f6d61801","Type":"ContainerDied","Data":"be763c4ea500c4509b35f741338737b9173afba1ba0428d16b5db6b158cc301f"} Jan 29 16:57:41 crc kubenswrapper[4886]: E0129 16:57:41.741113 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 16:57:41 crc kubenswrapper[4886]: E0129 16:57:41.741721 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kfqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-m4fv5_openshift-marketplace(3e333f39-f93b-4066-8e9f-4bd27e4d3672): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:57:41 crc kubenswrapper[4886]: E0129 16:57:41.743043 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.203860 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.327316 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-bundle\") pod \"aa613edd-15e0-466f-8739-ab30f6d61801\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.327415 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8dq4\" (UniqueName: \"kubernetes.io/projected/aa613edd-15e0-466f-8739-ab30f6d61801-kube-api-access-z8dq4\") pod \"aa613edd-15e0-466f-8739-ab30f6d61801\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.327431 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-util\") pod \"aa613edd-15e0-466f-8739-ab30f6d61801\" (UID: \"aa613edd-15e0-466f-8739-ab30f6d61801\") " Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.328750 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-bundle" (OuterVolumeSpecName: "bundle") pod "aa613edd-15e0-466f-8739-ab30f6d61801" (UID: "aa613edd-15e0-466f-8739-ab30f6d61801"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.333620 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa613edd-15e0-466f-8739-ab30f6d61801-kube-api-access-z8dq4" (OuterVolumeSpecName: "kube-api-access-z8dq4") pod "aa613edd-15e0-466f-8739-ab30f6d61801" (UID: "aa613edd-15e0-466f-8739-ab30f6d61801"). InnerVolumeSpecName "kube-api-access-z8dq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.338928 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-util" (OuterVolumeSpecName: "util") pod "aa613edd-15e0-466f-8739-ab30f6d61801" (UID: "aa613edd-15e0-466f-8739-ab30f6d61801"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.429593 4886 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.429633 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8dq4\" (UniqueName: \"kubernetes.io/projected/aa613edd-15e0-466f-8739-ab30f6d61801-kube-api-access-z8dq4\") on node \"crc\" DevicePath \"\"" Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.429642 4886 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa613edd-15e0-466f-8739-ab30f6d61801-util\") on node \"crc\" DevicePath \"\"" Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.851435 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" event={"ID":"aa613edd-15e0-466f-8739-ab30f6d61801","Type":"ContainerDied","Data":"8be2bbeba35f6a3a828f1cf712135895f74da1f19506f9a662f89c6ac9ba1865"} Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.851493 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8be2bbeba35f6a3a828f1cf712135895f74da1f19506f9a662f89c6ac9ba1865" Jan 29 16:57:42 crc kubenswrapper[4886]: I0129 16:57:42.851521 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.066247 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k"] Jan 29 16:57:53 crc kubenswrapper[4886]: E0129 16:57:53.067127 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" containerName="pull" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.067141 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" containerName="pull" Jan 29 16:57:53 crc kubenswrapper[4886]: E0129 16:57:53.067162 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" containerName="extract" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.067168 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" containerName="extract" Jan 29 16:57:53 crc kubenswrapper[4886]: E0129 16:57:53.067183 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" containerName="util" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.067189 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" containerName="util" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.067353 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa613edd-15e0-466f-8739-ab30f6d61801" containerName="extract" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.067899 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.070690 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.072084 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.072173 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.072219 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fp46d" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.073130 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.111802 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k"] Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.213755 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc960811-7f19-4248-8d44-e3ffcb98d650-apiservice-cert\") pod \"metallb-operator-controller-manager-77cfddbbb9-wbb7k\" (UID: \"dc960811-7f19-4248-8d44-e3ffcb98d650\") " pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.213884 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdp59\" (UniqueName: \"kubernetes.io/projected/dc960811-7f19-4248-8d44-e3ffcb98d650-kube-api-access-gdp59\") pod \"metallb-operator-controller-manager-77cfddbbb9-wbb7k\" (UID: \"dc960811-7f19-4248-8d44-e3ffcb98d650\") " pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.213950 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc960811-7f19-4248-8d44-e3ffcb98d650-webhook-cert\") pod \"metallb-operator-controller-manager-77cfddbbb9-wbb7k\" (UID: \"dc960811-7f19-4248-8d44-e3ffcb98d650\") " pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.315925 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdp59\" (UniqueName: \"kubernetes.io/projected/dc960811-7f19-4248-8d44-e3ffcb98d650-kube-api-access-gdp59\") pod \"metallb-operator-controller-manager-77cfddbbb9-wbb7k\" (UID: \"dc960811-7f19-4248-8d44-e3ffcb98d650\") " pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.316064 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc960811-7f19-4248-8d44-e3ffcb98d650-webhook-cert\") pod \"metallb-operator-controller-manager-77cfddbbb9-wbb7k\" (UID: \"dc960811-7f19-4248-8d44-e3ffcb98d650\") " pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.316128 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc960811-7f19-4248-8d44-e3ffcb98d650-apiservice-cert\") pod \"metallb-operator-controller-manager-77cfddbbb9-wbb7k\" (UID: \"dc960811-7f19-4248-8d44-e3ffcb98d650\") " pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.323928 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc960811-7f19-4248-8d44-e3ffcb98d650-webhook-cert\") pod \"metallb-operator-controller-manager-77cfddbbb9-wbb7k\" (UID: \"dc960811-7f19-4248-8d44-e3ffcb98d650\") " pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.332534 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdp59\" (UniqueName: \"kubernetes.io/projected/dc960811-7f19-4248-8d44-e3ffcb98d650-kube-api-access-gdp59\") pod \"metallb-operator-controller-manager-77cfddbbb9-wbb7k\" (UID: \"dc960811-7f19-4248-8d44-e3ffcb98d650\") " pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.332902 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc960811-7f19-4248-8d44-e3ffcb98d650-apiservice-cert\") pod \"metallb-operator-controller-manager-77cfddbbb9-wbb7k\" (UID: \"dc960811-7f19-4248-8d44-e3ffcb98d650\") " pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.401285 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt"] Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.402952 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.406631 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.406865 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.407320 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-kthzn" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.409312 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt"] Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.412838 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.519045 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a88b1900-1763-4d6c-9b3a-62598ab57eda-apiservice-cert\") pod \"metallb-operator-webhook-server-96d4668dd-sb2zt\" (UID: \"a88b1900-1763-4d6c-9b3a-62598ab57eda\") " pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.519132 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a88b1900-1763-4d6c-9b3a-62598ab57eda-webhook-cert\") pod \"metallb-operator-webhook-server-96d4668dd-sb2zt\" (UID: \"a88b1900-1763-4d6c-9b3a-62598ab57eda\") " pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.519479 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4lxq\" (UniqueName: \"kubernetes.io/projected/a88b1900-1763-4d6c-9b3a-62598ab57eda-kube-api-access-h4lxq\") pod \"metallb-operator-webhook-server-96d4668dd-sb2zt\" (UID: \"a88b1900-1763-4d6c-9b3a-62598ab57eda\") " pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.625318 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a88b1900-1763-4d6c-9b3a-62598ab57eda-apiservice-cert\") pod \"metallb-operator-webhook-server-96d4668dd-sb2zt\" (UID: \"a88b1900-1763-4d6c-9b3a-62598ab57eda\") " pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.625413 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a88b1900-1763-4d6c-9b3a-62598ab57eda-webhook-cert\") pod \"metallb-operator-webhook-server-96d4668dd-sb2zt\" (UID: \"a88b1900-1763-4d6c-9b3a-62598ab57eda\") " pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.625538 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4lxq\" (UniqueName: \"kubernetes.io/projected/a88b1900-1763-4d6c-9b3a-62598ab57eda-kube-api-access-h4lxq\") pod \"metallb-operator-webhook-server-96d4668dd-sb2zt\" (UID: \"a88b1900-1763-4d6c-9b3a-62598ab57eda\") " pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.648387 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4lxq\" (UniqueName: \"kubernetes.io/projected/a88b1900-1763-4d6c-9b3a-62598ab57eda-kube-api-access-h4lxq\") pod \"metallb-operator-webhook-server-96d4668dd-sb2zt\" (UID: \"a88b1900-1763-4d6c-9b3a-62598ab57eda\") " pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.653981 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a88b1900-1763-4d6c-9b3a-62598ab57eda-apiservice-cert\") pod \"metallb-operator-webhook-server-96d4668dd-sb2zt\" (UID: \"a88b1900-1763-4d6c-9b3a-62598ab57eda\") " 
pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.663006 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a88b1900-1763-4d6c-9b3a-62598ab57eda-webhook-cert\") pod \"metallb-operator-webhook-server-96d4668dd-sb2zt\" (UID: \"a88b1900-1763-4d6c-9b3a-62598ab57eda\") " pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:57:53 crc kubenswrapper[4886]: I0129 16:57:53.720060 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:57:54 crc kubenswrapper[4886]: I0129 16:57:54.050823 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt"] Jan 29 16:57:54 crc kubenswrapper[4886]: W0129 16:57:54.056364 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda88b1900_1763_4d6c_9b3a_62598ab57eda.slice/crio-2b112fa27eef35a6793dab8a7c5b4bb512aac25a538c8c5bb4daa66864da7e80 WatchSource:0}: Error finding container 2b112fa27eef35a6793dab8a7c5b4bb512aac25a538c8c5bb4daa66864da7e80: Status 404 returned error can't find the container with id 2b112fa27eef35a6793dab8a7c5b4bb512aac25a538c8c5bb4daa66864da7e80 Jan 29 16:57:54 crc kubenswrapper[4886]: I0129 16:57:54.065100 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k"] Jan 29 16:57:54 crc kubenswrapper[4886]: W0129 16:57:54.071953 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc960811_7f19_4248_8d44_e3ffcb98d650.slice/crio-64f89935ee5dfac6c771d94b60d52920c35dad896ce51393f15d74cbbeb48d5b WatchSource:0}: Error finding container 64f89935ee5dfac6c771d94b60d52920c35dad896ce51393f15d74cbbeb48d5b: Status 404 returned error can't find the container with id 64f89935ee5dfac6c771d94b60d52920c35dad896ce51393f15d74cbbeb48d5b Jan 29 16:57:54 crc kubenswrapper[4886]: E0129 16:57:54.193782 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/openshift4/metallb-rhel9@sha256:dfdc96eec0d63a5abd9e75003d3ed847582118f9cc839ad1094baf866733699d: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/openshift4/metallb-rhel9@sha256:dfdc96eec0d63a5abd9e75003d3ed847582118f9cc839ad1094baf866733699d" Jan 29 16:57:54 crc kubenswrapper[4886]: E0129 16:57:54.193975 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:webhook-server,Image:registry.redhat.io/openshift4/metallb-rhel9@sha256:dfdc96eec0d63a5abd9e75003d3ed847582118f9cc839ad1094baf866733699d,Command:[/controller],Args:[--disable-cert-rotation=true --port=7472 --log-level=info 
--webhook-mode=onlywebhook],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:monitoring,HostPort:0,ContainerPort:7472,Protocol:TCP,HostIP:,},ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:METALLB_BGP_TYPE,Value:frr,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202601071645,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h4lxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000730000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod metallb-operator-webhook-server-96d4668dd-sb2zt_metallb-system(a88b1900-1763-4d6c-9b3a-62598ab57eda): ErrImagePull: initializing source docker://registry.redhat.io/openshift4/metallb-rhel9@sha256:dfdc96eec0d63a5abd9e75003d3ed847582118f9cc839ad1094baf866733699d: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:57:54 crc kubenswrapper[4886]: E0129 16:57:54.195134 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ErrImagePull: \"initializing source docker://registry.redhat.io/openshift4/metallb-rhel9@sha256:dfdc96eec0d63a5abd9e75003d3ed847582118f9cc839ad1094baf866733699d: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" podUID="a88b1900-1763-4d6c-9b3a-62598ab57eda" Jan 29 16:57:54 crc kubenswrapper[4886]: E0129 16:57:54.198491 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:dc12a5ec124aac3c8fa5d1a9c9e063b1854864dc58e0e3ed02e01bf8e5eaaae0: Requesting bearer token: invalid status code from registry 403 
(Forbidden)" image="registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:dc12a5ec124aac3c8fa5d1a9c9e063b1854864dc58e0e3ed02e01bf8e5eaaae0" Jan 29 16:57:54 crc kubenswrapper[4886]: E0129 16:57:54.198742 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:dc12a5ec124aac3c8fa5d1a9c9e063b1854864dc58e0e3ed02e01bf8e5eaaae0,Command:[/manager],Args:[--enable-leader-election --disable-cert-rotation=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:SPEAKER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:dfdc96eec0d63a5abd9e75003d3ed847582118f9cc839ad1094baf866733699d,ValueFrom:nil,},EnvVar{Name:CONTROLLER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:dfdc96eec0d63a5abd9e75003d3ed847582118f9cc839ad1094baf866733699d,ValueFrom:nil,},EnvVar{Name:FRR_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:b4dd345e67e0d4f80968f2f04aac4d5da1ce02a3b880502867e03d4fa46d3862,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:86800d7a823cf444db8393dd7ffa735b2e42e9120f3f869487b0a2ed6b0db73d,ValueFrom:nil,},EnvVar{Name:DEPLOY_KUBE_RBAC_PROXIES,Value:true,ValueFrom:nil,},EnvVar{Name:FRRK8S_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:b4dd345e67e0d4f80968f2f04aac4d5da1ce02a3b880502867e03d4fa46d3862,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:DEPLOY_PODMONITORS,Value:false,ValueFrom:nil,},EnvVar{Name:DEPLOY_SERVICEMONITORS,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_OPERATOR_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_PORT,Value:29150,ValueFrom:nil,},EnvVar{Name:HTTPS_METRICS_PORT,Value:9120,ValueFrom:nil,},EnvVar{Name:FRR_METRICS_PORT,Value:29151,ValueFrom:nil,},EnvVar{Name:FRR_HTTPS_METRICS_PORT,Value:9121,ValueFrom:nil,},EnvVar{Name:MEMBER_LIST_BIND_PORT,Value:9122,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202601071645,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdp59,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000730000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod metallb-operator-controller-manager-77cfddbbb9-wbb7k_metallb-system(dc960811-7f19-4248-8d44-e3ffcb98d650): ErrImagePull: initializing source docker://registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:dc12a5ec124aac3c8fa5d1a9c9e063b1854864dc58e0e3ed02e01bf8e5eaaae0: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 16:57:54 crc kubenswrapper[4886]: E0129 16:57:54.199978 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"initializing source docker://registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:dc12a5ec124aac3c8fa5d1a9c9e063b1854864dc58e0e3ed02e01bf8e5eaaae0: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" podUID="dc960811-7f19-4248-8d44-e3ffcb98d650" Jan 29 16:57:54 crc kubenswrapper[4886]: I0129 16:57:54.938306 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" event={"ID":"dc960811-7f19-4248-8d44-e3ffcb98d650","Type":"ContainerStarted","Data":"64f89935ee5dfac6c771d94b60d52920c35dad896ce51393f15d74cbbeb48d5b"} Jan 29 16:57:54 crc kubenswrapper[4886]: I0129 16:57:54.939664 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" event={"ID":"a88b1900-1763-4d6c-9b3a-62598ab57eda","Type":"ContainerStarted","Data":"2b112fa27eef35a6793dab8a7c5b4bb512aac25a538c8c5bb4daa66864da7e80"} Jan 29 16:57:54 crc kubenswrapper[4886]: E0129 16:57:54.941135 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:dfdc96eec0d63a5abd9e75003d3ed847582118f9cc839ad1094baf866733699d\\\"\"" pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" podUID="a88b1900-1763-4d6c-9b3a-62598ab57eda" Jan 29 16:57:54 crc kubenswrapper[4886]: E0129 16:57:54.949865 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:dc12a5ec124aac3c8fa5d1a9c9e063b1854864dc58e0e3ed02e01bf8e5eaaae0\\\"\"" pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" podUID="dc960811-7f19-4248-8d44-e3ffcb98d650" Jan 29 16:57:55 crc kubenswrapper[4886]: E0129 16:57:55.953093 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:dc12a5ec124aac3c8fa5d1a9c9e063b1854864dc58e0e3ed02e01bf8e5eaaae0\\\"\"" pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" podUID="dc960811-7f19-4248-8d44-e3ffcb98d650" Jan 29 16:57:55 crc kubenswrapper[4886]: E0129 16:57:55.953084 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:dfdc96eec0d63a5abd9e75003d3ed847582118f9cc839ad1094baf866733699d\\\"\"" pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" podUID="a88b1900-1763-4d6c-9b3a-62598ab57eda" Jan 29 16:57:56 crc kubenswrapper[4886]: E0129 16:57:56.616194 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.399657 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2qvtg"] Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.402103 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.418700 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2qvtg"] Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.591726 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkts7\" (UniqueName: \"kubernetes.io/projected/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-kube-api-access-mkts7\") pod \"certified-operators-2qvtg\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.591847 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-catalog-content\") pod \"certified-operators-2qvtg\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.591878 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-utilities\") pod \"certified-operators-2qvtg\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.693482 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-catalog-content\") pod \"certified-operators-2qvtg\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.693523 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-utilities\") pod \"certified-operators-2qvtg\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.693620 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkts7\" (UniqueName: \"kubernetes.io/projected/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-kube-api-access-mkts7\") pod \"certified-operators-2qvtg\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.694061 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-utilities\") pod \"certified-operators-2qvtg\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.694080 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-catalog-content\") pod \"certified-operators-2qvtg\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.720717 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkts7\" (UniqueName: \"kubernetes.io/projected/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-kube-api-access-mkts7\") pod \"certified-operators-2qvtg\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:03 crc kubenswrapper[4886]: I0129 16:58:03.722461 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:04 crc kubenswrapper[4886]: I0129 16:58:04.225387 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2qvtg"] Jan 29 16:58:05 crc kubenswrapper[4886]: I0129 16:58:05.021057 4886 generic.go:334] "Generic (PLEG): container finished" podID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" containerID="9f0050564609cc0eca08e69957d66f1e81d2ea75d50e8f8d88f203014bf5732a" exitCode=0 Jan 29 16:58:05 crc kubenswrapper[4886]: I0129 16:58:05.021165 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qvtg" event={"ID":"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d","Type":"ContainerDied","Data":"9f0050564609cc0eca08e69957d66f1e81d2ea75d50e8f8d88f203014bf5732a"} Jan 29 16:58:05 crc kubenswrapper[4886]: I0129 16:58:05.021467 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qvtg" event={"ID":"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d","Type":"ContainerStarted","Data":"8c993bdc28773309bb449abbfde8c5ecf5591ad63dba75097c127b6a364bd347"} Jan 29 16:58:07 crc kubenswrapper[4886]: I0129 16:58:07.037345 4886 generic.go:334] "Generic (PLEG): container finished" podID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" containerID="d088fe64c24645b26688c050b3ff9ba12fd160541a93430d870f7b94139713f6" exitCode=0 Jan 29 16:58:07 crc kubenswrapper[4886]: I0129 16:58:07.037451 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qvtg" event={"ID":"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d","Type":"ContainerDied","Data":"d088fe64c24645b26688c050b3ff9ba12fd160541a93430d870f7b94139713f6"} Jan 29 16:58:08 crc kubenswrapper[4886]: I0129 16:58:08.047009 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qvtg" event={"ID":"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d","Type":"ContainerStarted","Data":"34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c"} Jan 29 16:58:08 crc kubenswrapper[4886]: I0129 16:58:08.078886 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2qvtg" podStartSLOduration=2.674614074 podStartE2EDuration="5.078868025s" podCreationTimestamp="2026-01-29 16:58:03 +0000 UTC" firstStartedPulling="2026-01-29 16:58:05.022468011 +0000 UTC m=+2167.931187303" lastFinishedPulling="2026-01-29 16:58:07.426721982 +0000 UTC m=+2170.335441254" observedRunningTime="2026-01-29 16:58:08.077152716 +0000 UTC m=+2170.985871998" watchObservedRunningTime="2026-01-29 16:58:08.078868025 +0000 UTC m=+2170.987587297" Jan 29 16:58:11 crc kubenswrapper[4886]: E0129 16:58:11.208457 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:58:12 crc kubenswrapper[4886]: I0129 16:58:12.078439 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" event={"ID":"a88b1900-1763-4d6c-9b3a-62598ab57eda","Type":"ContainerStarted","Data":"383e810f953da73560beb294b0cc4e1ff2ce27a83a172970f5d32c3574834f4b"} Jan 29 16:58:12 crc kubenswrapper[4886]: I0129 16:58:12.079155 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:58:12 crc kubenswrapper[4886]: I0129 16:58:12.099721 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" podStartSLOduration=1.863078996 podStartE2EDuration="19.099700809s" podCreationTimestamp="2026-01-29 16:57:53 +0000 UTC" firstStartedPulling="2026-01-29 16:57:54.059851212 +0000 UTC m=+2156.968570484" lastFinishedPulling="2026-01-29 16:58:11.296473025 +0000 UTC m=+2174.205192297" observedRunningTime="2026-01-29 16:58:12.097477797 +0000 UTC m=+2175.006197069" watchObservedRunningTime="2026-01-29 16:58:12.099700809 +0000 UTC m=+2175.008420081" Jan 29 16:58:13 crc kubenswrapper[4886]: I0129 16:58:13.723003 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:13 crc kubenswrapper[4886]: I0129 16:58:13.723310 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:13 crc kubenswrapper[4886]: I0129 16:58:13.779845 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:14 crc kubenswrapper[4886]: I0129 16:58:14.103562 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" event={"ID":"dc960811-7f19-4248-8d44-e3ffcb98d650","Type":"ContainerStarted","Data":"06b3aba9f4c6c81a562e8f7ad3f2677eb99d0965c965ef78e7a90af2bd06a456"} Jan 29 16:58:14 crc kubenswrapper[4886]: I0129 16:58:14.125051 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" podStartSLOduration=1.759393225 podStartE2EDuration="21.12503673s" podCreationTimestamp="2026-01-29 16:57:53 +0000 UTC" firstStartedPulling="2026-01-29 16:57:54.07510131 +0000 UTC m=+2156.983820582" lastFinishedPulling="2026-01-29 16:58:13.440744815 +0000 UTC m=+2176.349464087" observedRunningTime="2026-01-29 16:58:14.121761038 +0000 UTC m=+2177.030480330" watchObservedRunningTime="2026-01-29 16:58:14.12503673 +0000 UTC m=+2177.033756002" Jan 29 16:58:14 crc kubenswrapper[4886]: I0129 16:58:14.155538 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:16 crc kubenswrapper[4886]: I0129 16:58:16.181429 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2qvtg"] Jan 29 16:58:16 crc kubenswrapper[4886]: I0129 16:58:16.181685 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2qvtg" podUID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" containerName="registry-server" containerID="cri-o://34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c" gracePeriod=2 Jan 29 16:58:16 crc kubenswrapper[4886]: I0129 16:58:16.614097 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:16 crc kubenswrapper[4886]: I0129 16:58:16.750279 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkts7\" (UniqueName: \"kubernetes.io/projected/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-kube-api-access-mkts7\") pod \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " Jan 29 16:58:16 crc kubenswrapper[4886]: I0129 16:58:16.750478 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-utilities\") pod \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " Jan 29 16:58:16 crc kubenswrapper[4886]: I0129 16:58:16.750519 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-catalog-content\") pod \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\" (UID: \"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d\") " Jan 29 16:58:16 crc kubenswrapper[4886]: I0129 16:58:16.751526 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-utilities" (OuterVolumeSpecName: "utilities") pod "ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" (UID: "ae46bd6d-bdc4-4ba0-9005-feff36c3c16d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:58:16 crc kubenswrapper[4886]: I0129 16:58:16.758141 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-kube-api-access-mkts7" (OuterVolumeSpecName: "kube-api-access-mkts7") pod "ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" (UID: "ae46bd6d-bdc4-4ba0-9005-feff36c3c16d"). InnerVolumeSpecName "kube-api-access-mkts7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:58:16 crc kubenswrapper[4886]: I0129 16:58:16.853185 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:58:16 crc kubenswrapper[4886]: I0129 16:58:16.853235 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkts7\" (UniqueName: \"kubernetes.io/projected/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-kube-api-access-mkts7\") on node \"crc\" DevicePath \"\"" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.136694 4886 generic.go:334] "Generic (PLEG): container finished" podID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" containerID="34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c" exitCode=0 Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.136774 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qvtg" event={"ID":"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d","Type":"ContainerDied","Data":"34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c"} Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.137017 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qvtg" event={"ID":"ae46bd6d-bdc4-4ba0-9005-feff36c3c16d","Type":"ContainerDied","Data":"8c993bdc28773309bb449abbfde8c5ecf5591ad63dba75097c127b6a364bd347"} Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.137041 4886 scope.go:117] "RemoveContainer" containerID="34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.136796 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2qvtg" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.180833 4886 scope.go:117] "RemoveContainer" containerID="d088fe64c24645b26688c050b3ff9ba12fd160541a93430d870f7b94139713f6" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.197794 4886 scope.go:117] "RemoveContainer" containerID="9f0050564609cc0eca08e69957d66f1e81d2ea75d50e8f8d88f203014bf5732a" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.215760 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" (UID: "ae46bd6d-bdc4-4ba0-9005-feff36c3c16d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.221598 4886 scope.go:117] "RemoveContainer" containerID="34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c" Jan 29 16:58:17 crc kubenswrapper[4886]: E0129 16:58:17.222078 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c\": container with ID starting with 34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c not found: ID does not exist" containerID="34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.222124 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c"} err="failed to get container status \"34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c\": rpc error: code = NotFound desc = could not find container \"34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c\": container with ID starting with 34bec3fb008591a00cbb8bda1d6bd98382aaf3a48fac7e2f9b7190d802e34a6c not found: ID does not exist" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.222158 4886 scope.go:117] "RemoveContainer" containerID="d088fe64c24645b26688c050b3ff9ba12fd160541a93430d870f7b94139713f6" Jan 29 16:58:17 crc kubenswrapper[4886]: E0129 16:58:17.222732 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d088fe64c24645b26688c050b3ff9ba12fd160541a93430d870f7b94139713f6\": container with ID starting with d088fe64c24645b26688c050b3ff9ba12fd160541a93430d870f7b94139713f6 not found: ID does not exist" containerID="d088fe64c24645b26688c050b3ff9ba12fd160541a93430d870f7b94139713f6" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.222766 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d088fe64c24645b26688c050b3ff9ba12fd160541a93430d870f7b94139713f6"} err="failed to get container status \"d088fe64c24645b26688c050b3ff9ba12fd160541a93430d870f7b94139713f6\": rpc error: code = NotFound desc = could not find container \"d088fe64c24645b26688c050b3ff9ba12fd160541a93430d870f7b94139713f6\": container with ID starting with d088fe64c24645b26688c050b3ff9ba12fd160541a93430d870f7b94139713f6 not found: ID does not exist" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.222803 4886 scope.go:117] "RemoveContainer" containerID="9f0050564609cc0eca08e69957d66f1e81d2ea75d50e8f8d88f203014bf5732a" Jan 29 16:58:17 crc kubenswrapper[4886]: E0129 16:58:17.223110 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f0050564609cc0eca08e69957d66f1e81d2ea75d50e8f8d88f203014bf5732a\": container with ID starting with 9f0050564609cc0eca08e69957d66f1e81d2ea75d50e8f8d88f203014bf5732a not found: ID does not exist" containerID="9f0050564609cc0eca08e69957d66f1e81d2ea75d50e8f8d88f203014bf5732a" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.223148 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0050564609cc0eca08e69957d66f1e81d2ea75d50e8f8d88f203014bf5732a"} err="failed to get container status \"9f0050564609cc0eca08e69957d66f1e81d2ea75d50e8f8d88f203014bf5732a\": rpc error: code = NotFound desc = could not 
find container \"9f0050564609cc0eca08e69957d66f1e81d2ea75d50e8f8d88f203014bf5732a\": container with ID starting with 9f0050564609cc0eca08e69957d66f1e81d2ea75d50e8f8d88f203014bf5732a not found: ID does not exist" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.260084 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.471975 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2qvtg"] Jan 29 16:58:17 crc kubenswrapper[4886]: I0129 16:58:17.478194 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2qvtg"] Jan 29 16:58:18 crc kubenswrapper[4886]: I0129 16:58:18.629771 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" path="/var/lib/kubelet/pods/ae46bd6d-bdc4-4ba0-9005-feff36c3c16d/volumes" Jan 29 16:58:23 crc kubenswrapper[4886]: I0129 16:58:23.413983 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:58:23 crc kubenswrapper[4886]: I0129 16:58:23.725101 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-96d4668dd-sb2zt" Jan 29 16:58:24 crc kubenswrapper[4886]: E0129 16:58:24.618309 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:58:37 crc kubenswrapper[4886]: E0129 16:58:37.617556 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:58:43 crc kubenswrapper[4886]: I0129 16:58:43.417030 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-77cfddbbb9-wbb7k" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.120359 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-b4pt6"] Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.120831 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" containerName="extract-content" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.120847 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" containerName="extract-content" Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.120879 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" containerName="registry-server" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.120885 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" containerName="registry-server" Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.120898 4886 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" containerName="extract-utilities" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.120904 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" containerName="extract-utilities" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.121038 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae46bd6d-bdc4-4ba0-9005-feff36c3c16d" containerName="registry-server" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.123458 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.126619 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.127442 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.128022 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qtxz2" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.130078 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w"] Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.131311 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.139111 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.143784 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w"] Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.189198 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-frr-conf\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.189257 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-reloader\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.189276 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/daa4e7b8-3078-4fd1-bb04-5185fa474080-metrics-certs\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.189310 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg57n\" (UniqueName: \"kubernetes.io/projected/cf3feb5c-d348-4c0a-95c7-46f18db4687c-kube-api-access-hg57n\") pod \"frr-k8s-webhook-server-7df86c4f6c-x455w\" (UID: \"cf3feb5c-d348-4c0a-95c7-46f18db4687c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.189401 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/daa4e7b8-3078-4fd1-bb04-5185fa474080-frr-startup\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.189416 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-frr-sockets\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.189436 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npq8r\" (UniqueName: \"kubernetes.io/projected/daa4e7b8-3078-4fd1-bb04-5185fa474080-kube-api-access-npq8r\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.189450 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3feb5c-d348-4c0a-95c7-46f18db4687c-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-x455w\" (UID: \"cf3feb5c-d348-4c0a-95c7-46f18db4687c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.189481 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-metrics\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.220391 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-bmwgt"] Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.221760 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.225281 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-dx2wk" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.225641 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.225927 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.226163 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.236618 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-tlnpb"] Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.238020 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.254037 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.261590 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-tlnpb"] Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.290415 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5fe12a1b-277f-429e-a6b8-a874ec6e4918-metallb-excludel2\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.290477 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-frr-conf\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.290514 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-memberlist\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.290555 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-reloader\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.290657 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-metrics-certs\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.290785 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/daa4e7b8-3078-4fd1-bb04-5185fa474080-metrics-certs\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.290909 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg57n\" (UniqueName: \"kubernetes.io/projected/cf3feb5c-d348-4c0a-95c7-46f18db4687c-kube-api-access-hg57n\") pod \"frr-k8s-webhook-server-7df86c4f6c-x455w\" (UID: \"cf3feb5c-d348-4c0a-95c7-46f18db4687c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.290991 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-frr-conf\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.290997 4886 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret 
"frr-k8s-certs-secret" not found Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.291048 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/daa4e7b8-3078-4fd1-bb04-5185fa474080-frr-startup\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.291090 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daa4e7b8-3078-4fd1-bb04-5185fa474080-metrics-certs podName:daa4e7b8-3078-4fd1-bb04-5185fa474080 nodeName:}" failed. No retries permitted until 2026-01-29 16:58:44.791068105 +0000 UTC m=+2207.699787377 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/daa4e7b8-3078-4fd1-bb04-5185fa474080-metrics-certs") pod "frr-k8s-b4pt6" (UID: "daa4e7b8-3078-4fd1-bb04-5185fa474080") : secret "frr-k8s-certs-secret" not found Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.291110 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-frr-sockets\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.291191 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3feb5c-d348-4c0a-95c7-46f18db4687c-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-x455w\" (UID: \"cf3feb5c-d348-4c0a-95c7-46f18db4687c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.291213 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npq8r\" (UniqueName: \"kubernetes.io/projected/daa4e7b8-3078-4fd1-bb04-5185fa474080-kube-api-access-npq8r\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.291276 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-metrics\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.290921 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-reloader\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.291373 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbc6\" (UniqueName: \"kubernetes.io/projected/5fe12a1b-277f-429e-a6b8-a874ec6e4918-kube-api-access-5qbc6\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.291476 4886 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.291485 4886 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-frr-sockets\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.291649 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf3feb5c-d348-4c0a-95c7-46f18db4687c-cert podName:cf3feb5c-d348-4c0a-95c7-46f18db4687c nodeName:}" failed. No retries permitted until 2026-01-29 16:58:44.791640751 +0000 UTC m=+2207.700360023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf3feb5c-d348-4c0a-95c7-46f18db4687c-cert") pod "frr-k8s-webhook-server-7df86c4f6c-x455w" (UID: "cf3feb5c-d348-4c0a-95c7-46f18db4687c") : secret "frr-k8s-webhook-server-cert" not found Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.291930 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/daa4e7b8-3078-4fd1-bb04-5185fa474080-metrics\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.292244 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/daa4e7b8-3078-4fd1-bb04-5185fa474080-frr-startup\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.315256 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg57n\" (UniqueName: \"kubernetes.io/projected/cf3feb5c-d348-4c0a-95c7-46f18db4687c-kube-api-access-hg57n\") pod \"frr-k8s-webhook-server-7df86c4f6c-x455w\" (UID: \"cf3feb5c-d348-4c0a-95c7-46f18db4687c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.315987 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npq8r\" (UniqueName: \"kubernetes.io/projected/daa4e7b8-3078-4fd1-bb04-5185fa474080-kube-api-access-npq8r\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.392779 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5fe12a1b-277f-429e-a6b8-a874ec6e4918-metallb-excludel2\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.392859 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-memberlist\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.392891 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-metrics-certs\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.392932 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/946b39e6-3f42-4aff-a197-f29de26c175a-metrics-certs\") pod \"controller-6968d8fdc4-tlnpb\" (UID: \"946b39e6-3f42-4aff-a197-f29de26c175a\") " pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.392982 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/946b39e6-3f42-4aff-a197-f29de26c175a-cert\") pod \"controller-6968d8fdc4-tlnpb\" (UID: \"946b39e6-3f42-4aff-a197-f29de26c175a\") " pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.393072 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26txp\" (UniqueName: \"kubernetes.io/projected/946b39e6-3f42-4aff-a197-f29de26c175a-kube-api-access-26txp\") pod \"controller-6968d8fdc4-tlnpb\" (UID: \"946b39e6-3f42-4aff-a197-f29de26c175a\") " pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.393110 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbc6\" (UniqueName: \"kubernetes.io/projected/5fe12a1b-277f-429e-a6b8-a874ec6e4918-kube-api-access-5qbc6\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.393537 4886 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.393588 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-memberlist podName:5fe12a1b-277f-429e-a6b8-a874ec6e4918 nodeName:}" failed. No retries permitted until 2026-01-29 16:58:44.893573046 +0000 UTC m=+2207.802292318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-memberlist") pod "speaker-bmwgt" (UID: "5fe12a1b-277f-429e-a6b8-a874ec6e4918") : secret "metallb-memberlist" not found Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.393689 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5fe12a1b-277f-429e-a6b8-a874ec6e4918-metallb-excludel2\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.393750 4886 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.393779 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-metrics-certs podName:5fe12a1b-277f-429e-a6b8-a874ec6e4918 nodeName:}" failed. No retries permitted until 2026-01-29 16:58:44.893770722 +0000 UTC m=+2207.802490114 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-metrics-certs") pod "speaker-bmwgt" (UID: "5fe12a1b-277f-429e-a6b8-a874ec6e4918") : secret "speaker-certs-secret" not found Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.422177 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbc6\" (UniqueName: \"kubernetes.io/projected/5fe12a1b-277f-429e-a6b8-a874ec6e4918-kube-api-access-5qbc6\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.494610 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26txp\" (UniqueName: \"kubernetes.io/projected/946b39e6-3f42-4aff-a197-f29de26c175a-kube-api-access-26txp\") pod \"controller-6968d8fdc4-tlnpb\" (UID: \"946b39e6-3f42-4aff-a197-f29de26c175a\") " pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.494759 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/946b39e6-3f42-4aff-a197-f29de26c175a-metrics-certs\") pod \"controller-6968d8fdc4-tlnpb\" (UID: \"946b39e6-3f42-4aff-a197-f29de26c175a\") " pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.494793 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/946b39e6-3f42-4aff-a197-f29de26c175a-cert\") pod \"controller-6968d8fdc4-tlnpb\" (UID: \"946b39e6-3f42-4aff-a197-f29de26c175a\") " pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.496690 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.498582 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/946b39e6-3f42-4aff-a197-f29de26c175a-metrics-certs\") pod \"controller-6968d8fdc4-tlnpb\" (UID: \"946b39e6-3f42-4aff-a197-f29de26c175a\") " pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.508097 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/946b39e6-3f42-4aff-a197-f29de26c175a-cert\") pod \"controller-6968d8fdc4-tlnpb\" (UID: \"946b39e6-3f42-4aff-a197-f29de26c175a\") " pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.518505 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26txp\" (UniqueName: \"kubernetes.io/projected/946b39e6-3f42-4aff-a197-f29de26c175a-kube-api-access-26txp\") pod \"controller-6968d8fdc4-tlnpb\" (UID: \"946b39e6-3f42-4aff-a197-f29de26c175a\") " pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.554041 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.804319 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3feb5c-d348-4c0a-95c7-46f18db4687c-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-x455w\" (UID: \"cf3feb5c-d348-4c0a-95c7-46f18db4687c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.804871 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/daa4e7b8-3078-4fd1-bb04-5185fa474080-metrics-certs\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.808450 4886 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.808525 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf3feb5c-d348-4c0a-95c7-46f18db4687c-cert podName:cf3feb5c-d348-4c0a-95c7-46f18db4687c nodeName:}" failed. No retries permitted until 2026-01-29 16:58:45.808506648 +0000 UTC m=+2208.717225920 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf3feb5c-d348-4c0a-95c7-46f18db4687c-cert") pod "frr-k8s-webhook-server-7df86c4f6c-x455w" (UID: "cf3feb5c-d348-4c0a-95c7-46f18db4687c") : secret "frr-k8s-webhook-server-cert" not found Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.815676 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/daa4e7b8-3078-4fd1-bb04-5185fa474080-metrics-certs\") pod \"frr-k8s-b4pt6\" (UID: \"daa4e7b8-3078-4fd1-bb04-5185fa474080\") " pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.819450 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-tlnpb"] Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.906208 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-memberlist\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.906269 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-metrics-certs\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.906418 4886 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 29 16:58:44 crc kubenswrapper[4886]: E0129 16:58:44.906499 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-memberlist podName:5fe12a1b-277f-429e-a6b8-a874ec6e4918 nodeName:}" failed. No retries permitted until 2026-01-29 16:58:45.906478912 +0000 UTC m=+2208.815198224 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-memberlist") pod "speaker-bmwgt" (UID: "5fe12a1b-277f-429e-a6b8-a874ec6e4918") : secret "metallb-memberlist" not found Jan 29 16:58:44 crc kubenswrapper[4886]: I0129 16:58:44.911300 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-metrics-certs\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.042856 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.338240 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-tlnpb" event={"ID":"946b39e6-3f42-4aff-a197-f29de26c175a","Type":"ContainerStarted","Data":"90a7c096c5a920388f8b7a677acf53ff14f8d6e4ed7b994189b0d652dd1c845a"} Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.338274 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-tlnpb" event={"ID":"946b39e6-3f42-4aff-a197-f29de26c175a","Type":"ContainerStarted","Data":"a46d1d43cbb11249a72b7972494c82c2ee3869c566c7845a394db3f2044bf07a"} Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.338283 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-tlnpb" event={"ID":"946b39e6-3f42-4aff-a197-f29de26c175a","Type":"ContainerStarted","Data":"14035c49466ef64a1830ce88769cabcc33c33c4d4eb6cbcf988c66dc62f5e237"} Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.338305 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.339275 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4pt6" event={"ID":"daa4e7b8-3078-4fd1-bb04-5185fa474080","Type":"ContainerStarted","Data":"e06ea4729f480338d99865a7d8bba134df9367281d355191e7b099f5804ad529"} Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.361175 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-tlnpb" podStartSLOduration=1.361152583 podStartE2EDuration="1.361152583s" podCreationTimestamp="2026-01-29 16:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:58:45.355647438 +0000 UTC m=+2208.264366710" watchObservedRunningTime="2026-01-29 16:58:45.361152583 +0000 UTC m=+2208.269871855" Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.821654 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3feb5c-d348-4c0a-95c7-46f18db4687c-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-x455w\" (UID: \"cf3feb5c-d348-4c0a-95c7-46f18db4687c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.827588 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3feb5c-d348-4c0a-95c7-46f18db4687c-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-x455w\" (UID: \"cf3feb5c-d348-4c0a-95c7-46f18db4687c\") " 
pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.923792 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-memberlist\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.932704 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5fe12a1b-277f-429e-a6b8-a874ec6e4918-memberlist\") pod \"speaker-bmwgt\" (UID: \"5fe12a1b-277f-429e-a6b8-a874ec6e4918\") " pod="metallb-system/speaker-bmwgt" Jan 29 16:58:45 crc kubenswrapper[4886]: I0129 16:58:45.956612 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:58:46 crc kubenswrapper[4886]: I0129 16:58:46.041007 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bmwgt" Jan 29 16:58:46 crc kubenswrapper[4886]: I0129 16:58:46.347766 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bmwgt" event={"ID":"5fe12a1b-277f-429e-a6b8-a874ec6e4918","Type":"ContainerStarted","Data":"35142a1b6f288abc0ba405b57207c5e5432fb9b2dea12b9cab7fe98330e632fc"} Jan 29 16:58:46 crc kubenswrapper[4886]: I0129 16:58:46.425926 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w"] Jan 29 16:58:47 crc kubenswrapper[4886]: I0129 16:58:47.356695 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bmwgt" event={"ID":"5fe12a1b-277f-429e-a6b8-a874ec6e4918","Type":"ContainerStarted","Data":"b53067bd49090ab3d385aa12303839c8e5c71a3df115717d83b06d14d270017a"} Jan 29 16:58:47 crc kubenswrapper[4886]: I0129 16:58:47.356997 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bmwgt" Jan 29 16:58:47 crc kubenswrapper[4886]: I0129 16:58:47.357008 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bmwgt" event={"ID":"5fe12a1b-277f-429e-a6b8-a874ec6e4918","Type":"ContainerStarted","Data":"032bc1216d2da6c5bf637ba863a902307edea9aad036b24c6a8eaaeb30a8233a"} Jan 29 16:58:47 crc kubenswrapper[4886]: I0129 16:58:47.357909 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" event={"ID":"cf3feb5c-d348-4c0a-95c7-46f18db4687c","Type":"ContainerStarted","Data":"3c2a2e57cd9f8d3c302221dc22b2b96ce896d9e8b852e3c8adbb7972202481b5"} Jan 29 16:58:48 crc kubenswrapper[4886]: I0129 16:58:48.637105 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-bmwgt" podStartSLOduration=4.637059134 podStartE2EDuration="4.637059134s" podCreationTimestamp="2026-01-29 16:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 16:58:47.37749308 +0000 UTC m=+2210.286212352" watchObservedRunningTime="2026-01-29 16:58:48.637059134 +0000 UTC m=+2211.545778406" Jan 29 16:58:49 crc kubenswrapper[4886]: E0129 16:58:49.628567 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" Jan 29 16:58:54 crc kubenswrapper[4886]: I0129 16:58:54.417950 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" event={"ID":"cf3feb5c-d348-4c0a-95c7-46f18db4687c","Type":"ContainerStarted","Data":"679ea7191cf4d24a40aab69fd3b514e325f6feaeb59a810116b9ebd2cc7deaf6"} Jan 29 16:58:54 crc kubenswrapper[4886]: I0129 16:58:54.418591 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:58:54 crc kubenswrapper[4886]: I0129 16:58:54.433522 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" podStartSLOduration=2.873191484 podStartE2EDuration="10.433506844s" podCreationTimestamp="2026-01-29 16:58:44 +0000 UTC" firstStartedPulling="2026-01-29 16:58:46.44006294 +0000 UTC m=+2209.348782212" lastFinishedPulling="2026-01-29 16:58:54.00037829 +0000 UTC m=+2216.909097572" observedRunningTime="2026-01-29 16:58:54.432161387 +0000 UTC m=+2217.340880649" watchObservedRunningTime="2026-01-29 16:58:54.433506844 +0000 UTC m=+2217.342226116" Jan 29 16:58:54 crc kubenswrapper[4886]: I0129 16:58:54.557127 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-tlnpb" Jan 29 16:58:56 crc kubenswrapper[4886]: I0129 16:58:56.044932 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bmwgt" Jan 29 16:58:57 crc kubenswrapper[4886]: I0129 16:58:57.440792 4886 generic.go:334] "Generic (PLEG): container finished" podID="daa4e7b8-3078-4fd1-bb04-5185fa474080" containerID="21a0b606d61a6f6e26359e90b9f7f02797b091203f85dbe1eddb9a5153dee23b" exitCode=0 Jan 29 16:58:57 crc kubenswrapper[4886]: I0129 16:58:57.440853 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4pt6" event={"ID":"daa4e7b8-3078-4fd1-bb04-5185fa474080","Type":"ContainerDied","Data":"21a0b606d61a6f6e26359e90b9f7f02797b091203f85dbe1eddb9a5153dee23b"} Jan 29 16:58:58 crc kubenswrapper[4886]: I0129 16:58:58.450254 4886 generic.go:334] "Generic (PLEG): container finished" podID="daa4e7b8-3078-4fd1-bb04-5185fa474080" containerID="25827c5b02bfd7316f2f248eed60d598e7cf7efa786c464135e6dbd21e55a8a1" exitCode=0 Jan 29 16:58:58 crc kubenswrapper[4886]: I0129 16:58:58.450299 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4pt6" event={"ID":"daa4e7b8-3078-4fd1-bb04-5185fa474080","Type":"ContainerDied","Data":"25827c5b02bfd7316f2f248eed60d598e7cf7efa786c464135e6dbd21e55a8a1"} Jan 29 16:58:58 crc kubenswrapper[4886]: I0129 16:58:58.746079 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qnwgz"] Jan 29 16:58:58 crc kubenswrapper[4886]: I0129 16:58:58.747710 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qnwgz" Jan 29 16:58:58 crc kubenswrapper[4886]: I0129 16:58:58.751922 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 29 16:58:58 crc kubenswrapper[4886]: I0129 16:58:58.753486 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-lr84b" Jan 29 16:58:58 crc kubenswrapper[4886]: I0129 16:58:58.755211 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 29 16:58:58 crc kubenswrapper[4886]: I0129 16:58:58.766979 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qnwgz"] Jan 29 16:58:58 crc kubenswrapper[4886]: I0129 16:58:58.834944 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrd5z\" (UniqueName: \"kubernetes.io/projected/ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2-kube-api-access-lrd5z\") pod \"openstack-operator-index-qnwgz\" (UID: \"ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2\") " pod="openstack-operators/openstack-operator-index-qnwgz" Jan 29 16:58:58 crc kubenswrapper[4886]: I0129 16:58:58.936034 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrd5z\" (UniqueName: \"kubernetes.io/projected/ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2-kube-api-access-lrd5z\") pod \"openstack-operator-index-qnwgz\" (UID: \"ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2\") " pod="openstack-operators/openstack-operator-index-qnwgz" Jan 29 16:58:58 crc kubenswrapper[4886]: I0129 16:58:58.956174 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrd5z\" (UniqueName: \"kubernetes.io/projected/ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2-kube-api-access-lrd5z\") pod \"openstack-operator-index-qnwgz\" (UID: \"ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2\") " pod="openstack-operators/openstack-operator-index-qnwgz" Jan 29 16:58:59 crc kubenswrapper[4886]: I0129 16:58:59.076909 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qnwgz" Jan 29 16:58:59 crc kubenswrapper[4886]: I0129 16:58:59.466900 4886 generic.go:334] "Generic (PLEG): container finished" podID="daa4e7b8-3078-4fd1-bb04-5185fa474080" containerID="acb129ab0206aca82377f91455ef6b325a4f1c3434d95c34a20c88225efd4c3d" exitCode=0 Jan 29 16:58:59 crc kubenswrapper[4886]: I0129 16:58:59.466996 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4pt6" event={"ID":"daa4e7b8-3078-4fd1-bb04-5185fa474080","Type":"ContainerDied","Data":"acb129ab0206aca82377f91455ef6b325a4f1c3434d95c34a20c88225efd4c3d"} Jan 29 16:58:59 crc kubenswrapper[4886]: I0129 16:58:59.483174 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qnwgz"] Jan 29 16:59:00 crc kubenswrapper[4886]: I0129 16:59:00.476131 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qnwgz" event={"ID":"ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2","Type":"ContainerStarted","Data":"ee5a97170b8f7a7e021d72b474ef4b841031a4d1f2600ead3a0c2d42211558e6"} Jan 29 16:59:00 crc kubenswrapper[4886]: I0129 16:59:00.479587 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4pt6" event={"ID":"daa4e7b8-3078-4fd1-bb04-5185fa474080","Type":"ContainerStarted","Data":"4203e14a2ac44c65e3cc097c8472d981365bf56aa09b734316f98f7b8be42d92"} Jan 29 16:59:00 crc kubenswrapper[4886]: I0129 16:59:00.479640 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4pt6" event={"ID":"daa4e7b8-3078-4fd1-bb04-5185fa474080","Type":"ContainerStarted","Data":"75f3a507fe6d1f628a92fc5718710f0607717ba847441497f169ee297b0a6694"} Jan 29 16:59:00 crc kubenswrapper[4886]: I0129 16:59:00.479658 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4pt6" event={"ID":"daa4e7b8-3078-4fd1-bb04-5185fa474080","Type":"ContainerStarted","Data":"63cc1e42952fa8b15a3d002ad4ffa98bda98d1621b5eefcee2604097c29d2b66"} Jan 29 16:59:01 crc kubenswrapper[4886]: I0129 16:59:01.493818 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4pt6" event={"ID":"daa4e7b8-3078-4fd1-bb04-5185fa474080","Type":"ContainerStarted","Data":"b1a7f010389fbb8f26d68585c70e110e3a0ae726f93f2cc2ac75c9567a80bb2f"} Jan 29 16:59:01 crc kubenswrapper[4886]: I0129 16:59:01.739007 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qnwgz"] Jan 29 16:59:02 crc kubenswrapper[4886]: I0129 16:59:02.332792 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ddcl7"] Jan 29 16:59:02 crc kubenswrapper[4886]: I0129 16:59:02.335018 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ddcl7" Jan 29 16:59:02 crc kubenswrapper[4886]: I0129 16:59:02.354339 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ddcl7"] Jan 29 16:59:02 crc kubenswrapper[4886]: I0129 16:59:02.392510 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjddc\" (UniqueName: \"kubernetes.io/projected/9b2b35ba-9f49-4dd6-816d-6acc4e54e514-kube-api-access-mjddc\") pod \"openstack-operator-index-ddcl7\" (UID: \"9b2b35ba-9f49-4dd6-816d-6acc4e54e514\") " pod="openstack-operators/openstack-operator-index-ddcl7" Jan 29 16:59:02 crc kubenswrapper[4886]: I0129 16:59:02.493647 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjddc\" (UniqueName: \"kubernetes.io/projected/9b2b35ba-9f49-4dd6-816d-6acc4e54e514-kube-api-access-mjddc\") pod \"openstack-operator-index-ddcl7\" (UID: \"9b2b35ba-9f49-4dd6-816d-6acc4e54e514\") " pod="openstack-operators/openstack-operator-index-ddcl7" Jan 29 16:59:02 crc kubenswrapper[4886]: I0129 16:59:02.535278 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjddc\" (UniqueName: \"kubernetes.io/projected/9b2b35ba-9f49-4dd6-816d-6acc4e54e514-kube-api-access-mjddc\") pod \"openstack-operator-index-ddcl7\" (UID: \"9b2b35ba-9f49-4dd6-816d-6acc4e54e514\") " pod="openstack-operators/openstack-operator-index-ddcl7" Jan 29 16:59:02 crc kubenswrapper[4886]: I0129 16:59:02.654938 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ddcl7" Jan 29 16:59:03 crc kubenswrapper[4886]: I0129 16:59:03.556991 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4pt6" event={"ID":"daa4e7b8-3078-4fd1-bb04-5185fa474080","Type":"ContainerStarted","Data":"e38c6d3208f1788c5bfe3f357bef3c7c2a8bead2f458bdb21881d45d2fbb1f99"} Jan 29 16:59:03 crc kubenswrapper[4886]: I0129 16:59:03.654579 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ddcl7"] Jan 29 16:59:04 crc kubenswrapper[4886]: I0129 16:59:04.573191 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4pt6" event={"ID":"daa4e7b8-3078-4fd1-bb04-5185fa474080","Type":"ContainerStarted","Data":"4a04d113b40bcf9ea2910cae42f1486fc968739032f04351da7b23a47184f7d1"} Jan 29 16:59:04 crc kubenswrapper[4886]: I0129 16:59:04.573585 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:59:04 crc kubenswrapper[4886]: I0129 16:59:04.622300 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-b4pt6" podStartSLOduration=9.140252572 podStartE2EDuration="20.622271226s" podCreationTimestamp="2026-01-29 16:58:44 +0000 UTC" firstStartedPulling="2026-01-29 16:58:45.195024463 +0000 UTC m=+2208.103743735" lastFinishedPulling="2026-01-29 16:58:56.677043117 +0000 UTC m=+2219.585762389" observedRunningTime="2026-01-29 16:59:04.616416012 +0000 UTC m=+2227.525135324" watchObservedRunningTime="2026-01-29 16:59:04.622271226 +0000 UTC m=+2227.530990518" Jan 29 16:59:05 crc kubenswrapper[4886]: I0129 16:59:05.043845 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:59:05 crc kubenswrapper[4886]: I0129 16:59:05.098515 4886 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:59:05 crc kubenswrapper[4886]: I0129 16:59:05.583492 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ddcl7" event={"ID":"9b2b35ba-9f49-4dd6-816d-6acc4e54e514","Type":"ContainerStarted","Data":"ee24296722ba86dce412919d1af258d029c8c32cfa4628ead2f77068d6c1ed4f"} Jan 29 16:59:05 crc kubenswrapper[4886]: I0129 16:59:05.583828 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ddcl7" event={"ID":"9b2b35ba-9f49-4dd6-816d-6acc4e54e514","Type":"ContainerStarted","Data":"30f7e589be86b09188a992f305fc1177bcd24a4fc997eea3ff9f03b9b9cb6b77"} Jan 29 16:59:05 crc kubenswrapper[4886]: I0129 16:59:05.585482 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-qnwgz" podUID="ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2" containerName="registry-server" containerID="cri-o://a0b214de91e150b6b957d8f5429ccb90584e319fc888745b1be949d8551e92d4" gracePeriod=2 Jan 29 16:59:05 crc kubenswrapper[4886]: I0129 16:59:05.585893 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qnwgz" event={"ID":"ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2","Type":"ContainerStarted","Data":"a0b214de91e150b6b957d8f5429ccb90584e319fc888745b1be949d8551e92d4"} Jan 29 16:59:05 crc kubenswrapper[4886]: I0129 16:59:05.604053 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ddcl7" podStartSLOduration=3.193870964 podStartE2EDuration="3.604033773s" podCreationTimestamp="2026-01-29 16:59:02 +0000 UTC" firstStartedPulling="2026-01-29 16:59:04.762775246 +0000 UTC m=+2227.671494518" lastFinishedPulling="2026-01-29 16:59:05.172938055 +0000 UTC m=+2228.081657327" observedRunningTime="2026-01-29 16:59:05.602707335 +0000 UTC m=+2228.511426627" watchObservedRunningTime="2026-01-29 16:59:05.604033773 +0000 UTC m=+2228.512753045" Jan 29 16:59:05 crc kubenswrapper[4886]: I0129 16:59:05.624282 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qnwgz" podStartSLOduration=2.349477144 podStartE2EDuration="7.624263111s" podCreationTimestamp="2026-01-29 16:58:58 +0000 UTC" firstStartedPulling="2026-01-29 16:58:59.499215124 +0000 UTC m=+2222.407934396" lastFinishedPulling="2026-01-29 16:59:04.774001091 +0000 UTC m=+2227.682720363" observedRunningTime="2026-01-29 16:59:05.619847367 +0000 UTC m=+2228.528566649" watchObservedRunningTime="2026-01-29 16:59:05.624263111 +0000 UTC m=+2228.532982383" Jan 29 16:59:05 crc kubenswrapper[4886]: I0129 16:59:05.962556 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x455w" Jan 29 16:59:09 crc kubenswrapper[4886]: I0129 16:59:09.077215 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qnwgz" Jan 29 16:59:09 crc kubenswrapper[4886]: I0129 16:59:09.628134 4886 generic.go:334] "Generic (PLEG): container finished" podID="ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2" containerID="a0b214de91e150b6b957d8f5429ccb90584e319fc888745b1be949d8551e92d4" exitCode=0 Jan 29 16:59:09 crc kubenswrapper[4886]: I0129 16:59:09.628228 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qnwgz" 
event={"ID":"ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2","Type":"ContainerDied","Data":"a0b214de91e150b6b957d8f5429ccb90584e319fc888745b1be949d8551e92d4"} Jan 29 16:59:09 crc kubenswrapper[4886]: I0129 16:59:09.811153 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qnwgz" Jan 29 16:59:09 crc kubenswrapper[4886]: I0129 16:59:09.920122 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrd5z\" (UniqueName: \"kubernetes.io/projected/ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2-kube-api-access-lrd5z\") pod \"ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2\" (UID: \"ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2\") " Jan 29 16:59:09 crc kubenswrapper[4886]: I0129 16:59:09.925676 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2-kube-api-access-lrd5z" (OuterVolumeSpecName: "kube-api-access-lrd5z") pod "ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2" (UID: "ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2"). InnerVolumeSpecName "kube-api-access-lrd5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:59:10 crc kubenswrapper[4886]: I0129 16:59:10.021969 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrd5z\" (UniqueName: \"kubernetes.io/projected/ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2-kube-api-access-lrd5z\") on node \"crc\" DevicePath \"\"" Jan 29 16:59:10 crc kubenswrapper[4886]: I0129 16:59:10.643492 4886 generic.go:334] "Generic (PLEG): container finished" podID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" containerID="ce5c58afc7739fb2f46c85959cde9363860ba02d57d66af2be89058b5434f657" exitCode=0 Jan 29 16:59:10 crc kubenswrapper[4886]: I0129 16:59:10.643558 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4fv5" event={"ID":"3e333f39-f93b-4066-8e9f-4bd27e4d3672","Type":"ContainerDied","Data":"ce5c58afc7739fb2f46c85959cde9363860ba02d57d66af2be89058b5434f657"} Jan 29 16:59:10 crc kubenswrapper[4886]: I0129 16:59:10.647830 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qnwgz" event={"ID":"ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2","Type":"ContainerDied","Data":"ee5a97170b8f7a7e021d72b474ef4b841031a4d1f2600ead3a0c2d42211558e6"} Jan 29 16:59:10 crc kubenswrapper[4886]: I0129 16:59:10.647908 4886 scope.go:117] "RemoveContainer" containerID="a0b214de91e150b6b957d8f5429ccb90584e319fc888745b1be949d8551e92d4" Jan 29 16:59:10 crc kubenswrapper[4886]: I0129 16:59:10.647970 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qnwgz" Jan 29 16:59:10 crc kubenswrapper[4886]: I0129 16:59:10.721054 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qnwgz"] Jan 29 16:59:10 crc kubenswrapper[4886]: I0129 16:59:10.732481 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-qnwgz"] Jan 29 16:59:12 crc kubenswrapper[4886]: I0129 16:59:12.625199 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2" path="/var/lib/kubelet/pods/ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2/volumes" Jan 29 16:59:12 crc kubenswrapper[4886]: I0129 16:59:12.656231 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ddcl7" Jan 29 16:59:12 crc kubenswrapper[4886]: I0129 16:59:12.656275 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ddcl7" Jan 29 16:59:12 crc kubenswrapper[4886]: I0129 16:59:12.684872 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ddcl7" Jan 29 16:59:12 crc kubenswrapper[4886]: I0129 16:59:12.712027 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ddcl7" Jan 29 16:59:14 crc kubenswrapper[4886]: I0129 16:59:14.682853 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4fv5" event={"ID":"3e333f39-f93b-4066-8e9f-4bd27e4d3672","Type":"ContainerStarted","Data":"9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1"} Jan 29 16:59:14 crc kubenswrapper[4886]: I0129 16:59:14.714748 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m4fv5" podStartSLOduration=2.964061207 podStartE2EDuration="2m53.714733031s" podCreationTimestamp="2026-01-29 16:56:21 +0000 UTC" firstStartedPulling="2026-01-29 16:56:23.12173226 +0000 UTC m=+2066.030451532" lastFinishedPulling="2026-01-29 16:59:13.872404084 +0000 UTC m=+2236.781123356" observedRunningTime="2026-01-29 16:59:14.713858356 +0000 UTC m=+2237.622577618" watchObservedRunningTime="2026-01-29 16:59:14.714733031 +0000 UTC m=+2237.623452303" Jan 29 16:59:15 crc kubenswrapper[4886]: I0129 16:59:15.068720 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-b4pt6" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.183179 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp"] Jan 29 16:59:18 crc kubenswrapper[4886]: E0129 16:59:18.183906 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2" containerName="registry-server" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.183921 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2" containerName="registry-server" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.184128 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf6312e-f5f4-4cdf-89f6-eca0052b4ce2" containerName="registry-server" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.185439 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.188397 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-m8266" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.203400 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp"] Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.386790 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdxz9\" (UniqueName: \"kubernetes.io/projected/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-kube-api-access-qdxz9\") pod \"39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.387008 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-util\") pod \"39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.387080 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-bundle\") pod \"39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.488455 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-bundle\") pod \"39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.488585 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdxz9\" (UniqueName: \"kubernetes.io/projected/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-kube-api-access-qdxz9\") pod \"39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.488772 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-util\") pod \"39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.488967 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-bundle\") pod \"39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.489247 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-util\") pod \"39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.533728 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdxz9\" (UniqueName: \"kubernetes.io/projected/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-kube-api-access-qdxz9\") pod \"39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:18 crc kubenswrapper[4886]: I0129 16:59:18.810736 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:19 crc kubenswrapper[4886]: W0129 16:59:19.277929 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5eb87e5_9a66_4bf3_8348_1dc03c7e0e8e.slice/crio-fb2914aa8dc2c93108ec5ed30e7b3b77724878da0cb11ac6bf0a6c92f19837f6 WatchSource:0}: Error finding container fb2914aa8dc2c93108ec5ed30e7b3b77724878da0cb11ac6bf0a6c92f19837f6: Status 404 returned error can't find the container with id fb2914aa8dc2c93108ec5ed30e7b3b77724878da0cb11ac6bf0a6c92f19837f6 Jan 29 16:59:19 crc kubenswrapper[4886]: I0129 16:59:19.282416 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp"] Jan 29 16:59:19 crc kubenswrapper[4886]: I0129 16:59:19.724920 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" event={"ID":"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e","Type":"ContainerStarted","Data":"fb2914aa8dc2c93108ec5ed30e7b3b77724878da0cb11ac6bf0a6c92f19837f6"} Jan 29 16:59:20 crc kubenswrapper[4886]: I0129 16:59:20.733637 4886 generic.go:334] "Generic (PLEG): container finished" podID="c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" containerID="f99f0442ec3925d7cfe1e552bc529dd0f7264a1bd5daec05d7d50b14d01e3241" exitCode=0 Jan 29 16:59:20 crc kubenswrapper[4886]: I0129 16:59:20.733675 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" event={"ID":"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e","Type":"ContainerDied","Data":"f99f0442ec3925d7cfe1e552bc529dd0f7264a1bd5daec05d7d50b14d01e3241"} Jan 29 16:59:22 crc kubenswrapper[4886]: I0129 16:59:22.129724 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:59:22 crc kubenswrapper[4886]: I0129 16:59:22.129799 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:59:22 crc 
kubenswrapper[4886]: I0129 16:59:22.190443 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:59:22 crc kubenswrapper[4886]: I0129 16:59:22.839743 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:59:23 crc kubenswrapper[4886]: I0129 16:59:23.530003 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4fv5"] Jan 29 16:59:24 crc kubenswrapper[4886]: I0129 16:59:24.776371 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m4fv5" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" containerName="registry-server" containerID="cri-o://9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1" gracePeriod=2 Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.513705 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.608209 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kfqs\" (UniqueName: \"kubernetes.io/projected/3e333f39-f93b-4066-8e9f-4bd27e4d3672-kube-api-access-6kfqs\") pod \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.608500 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-catalog-content\") pod \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.608584 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-utilities\") pod \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\" (UID: \"3e333f39-f93b-4066-8e9f-4bd27e4d3672\") " Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.609265 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-utilities" (OuterVolumeSpecName: "utilities") pod "3e333f39-f93b-4066-8e9f-4bd27e4d3672" (UID: "3e333f39-f93b-4066-8e9f-4bd27e4d3672"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.614672 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e333f39-f93b-4066-8e9f-4bd27e4d3672-kube-api-access-6kfqs" (OuterVolumeSpecName: "kube-api-access-6kfqs") pod "3e333f39-f93b-4066-8e9f-4bd27e4d3672" (UID: "3e333f39-f93b-4066-8e9f-4bd27e4d3672"). InnerVolumeSpecName "kube-api-access-6kfqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.642116 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e333f39-f93b-4066-8e9f-4bd27e4d3672" (UID: "3e333f39-f93b-4066-8e9f-4bd27e4d3672"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.711383 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kfqs\" (UniqueName: \"kubernetes.io/projected/3e333f39-f93b-4066-8e9f-4bd27e4d3672-kube-api-access-6kfqs\") on node \"crc\" DevicePath \"\"" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.711412 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.711427 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e333f39-f93b-4066-8e9f-4bd27e4d3672-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.788454 4886 generic.go:334] "Generic (PLEG): container finished" podID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" containerID="9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1" exitCode=0 Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.788528 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4fv5" event={"ID":"3e333f39-f93b-4066-8e9f-4bd27e4d3672","Type":"ContainerDied","Data":"9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1"} Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.788559 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4fv5" event={"ID":"3e333f39-f93b-4066-8e9f-4bd27e4d3672","Type":"ContainerDied","Data":"721f687c812954ac213bf098f41dc7b5630da2bcf0b09ba3c2bdd27881939e63"} Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.788575 4886 scope.go:117] "RemoveContainer" containerID="9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.788581 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4fv5" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.792319 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" event={"ID":"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e","Type":"ContainerStarted","Data":"ec42cdfc44ca840cbb4fad62e8838ff084ffff3f56e86e59dc4375f0d43ac3af"} Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.809021 4886 scope.go:117] "RemoveContainer" containerID="ce5c58afc7739fb2f46c85959cde9363860ba02d57d66af2be89058b5434f657" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.836350 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4fv5"] Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.837731 4886 scope.go:117] "RemoveContainer" containerID="54c413f049295c75ea245b7bf5b81932f10621e4a5575c34da54c41a85be6026" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.848091 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4fv5"] Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.857651 4886 scope.go:117] "RemoveContainer" containerID="9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1" Jan 29 16:59:25 crc kubenswrapper[4886]: E0129 16:59:25.858073 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1\": container with ID starting with 9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1 not found: ID does not exist" containerID="9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.858118 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1"} err="failed to get container status \"9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1\": rpc error: code = NotFound desc = could not find container \"9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1\": container with ID starting with 9db5ae6315c700c7878b8b6ad7193c1666b3b2cc58fcace9fc8e327a5fb5a0e1 not found: ID does not exist" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.858148 4886 scope.go:117] "RemoveContainer" containerID="ce5c58afc7739fb2f46c85959cde9363860ba02d57d66af2be89058b5434f657" Jan 29 16:59:25 crc kubenswrapper[4886]: E0129 16:59:25.858725 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce5c58afc7739fb2f46c85959cde9363860ba02d57d66af2be89058b5434f657\": container with ID starting with ce5c58afc7739fb2f46c85959cde9363860ba02d57d66af2be89058b5434f657 not found: ID does not exist" containerID="ce5c58afc7739fb2f46c85959cde9363860ba02d57d66af2be89058b5434f657" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.858747 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce5c58afc7739fb2f46c85959cde9363860ba02d57d66af2be89058b5434f657"} err="failed to get container status \"ce5c58afc7739fb2f46c85959cde9363860ba02d57d66af2be89058b5434f657\": rpc error: code = NotFound desc = could not find container \"ce5c58afc7739fb2f46c85959cde9363860ba02d57d66af2be89058b5434f657\": container with ID starting with 
ce5c58afc7739fb2f46c85959cde9363860ba02d57d66af2be89058b5434f657 not found: ID does not exist" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.858763 4886 scope.go:117] "RemoveContainer" containerID="54c413f049295c75ea245b7bf5b81932f10621e4a5575c34da54c41a85be6026" Jan 29 16:59:25 crc kubenswrapper[4886]: E0129 16:59:25.858982 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c413f049295c75ea245b7bf5b81932f10621e4a5575c34da54c41a85be6026\": container with ID starting with 54c413f049295c75ea245b7bf5b81932f10621e4a5575c34da54c41a85be6026 not found: ID does not exist" containerID="54c413f049295c75ea245b7bf5b81932f10621e4a5575c34da54c41a85be6026" Jan 29 16:59:25 crc kubenswrapper[4886]: I0129 16:59:25.859001 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c413f049295c75ea245b7bf5b81932f10621e4a5575c34da54c41a85be6026"} err="failed to get container status \"54c413f049295c75ea245b7bf5b81932f10621e4a5575c34da54c41a85be6026\": rpc error: code = NotFound desc = could not find container \"54c413f049295c75ea245b7bf5b81932f10621e4a5575c34da54c41a85be6026\": container with ID starting with 54c413f049295c75ea245b7bf5b81932f10621e4a5575c34da54c41a85be6026 not found: ID does not exist" Jan 29 16:59:26 crc kubenswrapper[4886]: I0129 16:59:26.642257 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" path="/var/lib/kubelet/pods/3e333f39-f93b-4066-8e9f-4bd27e4d3672/volumes" Jan 29 16:59:26 crc kubenswrapper[4886]: I0129 16:59:26.804555 4886 generic.go:334] "Generic (PLEG): container finished" podID="c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" containerID="ec42cdfc44ca840cbb4fad62e8838ff084ffff3f56e86e59dc4375f0d43ac3af" exitCode=0 Jan 29 16:59:26 crc kubenswrapper[4886]: I0129 16:59:26.804597 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" event={"ID":"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e","Type":"ContainerDied","Data":"ec42cdfc44ca840cbb4fad62e8838ff084ffff3f56e86e59dc4375f0d43ac3af"} Jan 29 16:59:27 crc kubenswrapper[4886]: I0129 16:59:27.818207 4886 generic.go:334] "Generic (PLEG): container finished" podID="c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" containerID="a9e2a8679df68561a70f930872f41fede0f43990d3a760447e1bc513acacd728" exitCode=0 Jan 29 16:59:27 crc kubenswrapper[4886]: I0129 16:59:27.818383 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" event={"ID":"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e","Type":"ContainerDied","Data":"a9e2a8679df68561a70f930872f41fede0f43990d3a760447e1bc513acacd728"} Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.193574 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.274209 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-bundle\") pod \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.274384 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdxz9\" (UniqueName: \"kubernetes.io/projected/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-kube-api-access-qdxz9\") pod \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.274543 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-util\") pod \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\" (UID: \"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e\") " Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.284590 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-kube-api-access-qdxz9" (OuterVolumeSpecName: "kube-api-access-qdxz9") pod "c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" (UID: "c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e"). InnerVolumeSpecName "kube-api-access-qdxz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.290575 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-util" (OuterVolumeSpecName: "util") pod "c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" (UID: "c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.291170 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-bundle" (OuterVolumeSpecName: "bundle") pod "c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" (UID: "c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.376775 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdxz9\" (UniqueName: \"kubernetes.io/projected/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-kube-api-access-qdxz9\") on node \"crc\" DevicePath \"\"" Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.376828 4886 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-util\") on node \"crc\" DevicePath \"\"" Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.376848 4886 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.660528 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.660582 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.842564 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" event={"ID":"c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e","Type":"ContainerDied","Data":"fb2914aa8dc2c93108ec5ed30e7b3b77724878da0cb11ac6bf0a6c92f19837f6"} Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.842921 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb2914aa8dc2c93108ec5ed30e7b3b77724878da0cb11ac6bf0a6c92f19837f6" Jan 29 16:59:29 crc kubenswrapper[4886]: I0129 16:59:29.842982 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.423561 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf"] Jan 29 16:59:35 crc kubenswrapper[4886]: E0129 16:59:35.424576 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" containerName="extract-utilities" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.424594 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" containerName="extract-utilities" Jan 29 16:59:35 crc kubenswrapper[4886]: E0129 16:59:35.424610 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" containerName="extract-content" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.424618 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" containerName="extract-content" Jan 29 16:59:35 crc kubenswrapper[4886]: E0129 16:59:35.424634 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" containerName="pull" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.424642 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" containerName="pull" Jan 29 16:59:35 crc kubenswrapper[4886]: E0129 16:59:35.424655 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" containerName="extract" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.424662 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" containerName="extract" Jan 29 16:59:35 crc kubenswrapper[4886]: E0129 16:59:35.424673 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" containerName="registry-server" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.424681 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" containerName="registry-server" Jan 29 16:59:35 crc kubenswrapper[4886]: E0129 16:59:35.424697 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" containerName="util" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.424705 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" containerName="util" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.424868 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e333f39-f93b-4066-8e9f-4bd27e4d3672" containerName="registry-server" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.424888 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e" containerName="extract" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.425563 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.435923 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf"] Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.444718 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-vp8fr" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.588035 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzj8t\" (UniqueName: \"kubernetes.io/projected/d4b791b8-523f-4cf0-9ec7-9283c2fd4dde-kube-api-access-dzj8t\") pod \"openstack-operator-controller-init-86bf76f8cb-r9sbf\" (UID: \"d4b791b8-523f-4cf0-9ec7-9283c2fd4dde\") " pod="openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.689550 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzj8t\" (UniqueName: \"kubernetes.io/projected/d4b791b8-523f-4cf0-9ec7-9283c2fd4dde-kube-api-access-dzj8t\") pod \"openstack-operator-controller-init-86bf76f8cb-r9sbf\" (UID: \"d4b791b8-523f-4cf0-9ec7-9283c2fd4dde\") " pod="openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.711782 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzj8t\" (UniqueName: \"kubernetes.io/projected/d4b791b8-523f-4cf0-9ec7-9283c2fd4dde-kube-api-access-dzj8t\") pod \"openstack-operator-controller-init-86bf76f8cb-r9sbf\" (UID: \"d4b791b8-523f-4cf0-9ec7-9283c2fd4dde\") " pod="openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf" Jan 29 16:59:35 crc kubenswrapper[4886]: I0129 16:59:35.744927 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf" Jan 29 16:59:36 crc kubenswrapper[4886]: I0129 16:59:36.208892 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf"] Jan 29 16:59:36 crc kubenswrapper[4886]: I0129 16:59:36.903889 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf" event={"ID":"d4b791b8-523f-4cf0-9ec7-9283c2fd4dde","Type":"ContainerStarted","Data":"ce6fb9f5ef9512b738ac1d0e983bd606f5bfc0e429cc5a338bcde6ac28bc6c37"} Jan 29 16:59:40 crc kubenswrapper[4886]: I0129 16:59:40.863518 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l6kcr"] Jan 29 16:59:40 crc kubenswrapper[4886]: I0129 16:59:40.865695 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:40 crc kubenswrapper[4886]: I0129 16:59:40.891524 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l6kcr"] Jan 29 16:59:40 crc kubenswrapper[4886]: I0129 16:59:40.988135 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-catalog-content\") pod \"redhat-operators-l6kcr\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:40 crc kubenswrapper[4886]: I0129 16:59:40.988182 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp2tz\" (UniqueName: \"kubernetes.io/projected/ef4834a8-a534-49a1-ba4e-07543a1d73ff-kube-api-access-hp2tz\") pod \"redhat-operators-l6kcr\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:40 crc kubenswrapper[4886]: I0129 16:59:40.988280 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-utilities\") pod \"redhat-operators-l6kcr\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.089658 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-catalog-content\") pod \"redhat-operators-l6kcr\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.089708 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp2tz\" (UniqueName: \"kubernetes.io/projected/ef4834a8-a534-49a1-ba4e-07543a1d73ff-kube-api-access-hp2tz\") pod \"redhat-operators-l6kcr\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.089801 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-utilities\") pod \"redhat-operators-l6kcr\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.090159 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-catalog-content\") pod \"redhat-operators-l6kcr\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.090181 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-utilities\") pod \"redhat-operators-l6kcr\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.113393 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hp2tz\" (UniqueName: \"kubernetes.io/projected/ef4834a8-a534-49a1-ba4e-07543a1d73ff-kube-api-access-hp2tz\") pod \"redhat-operators-l6kcr\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.193735 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:41 crc kubenswrapper[4886]: W0129 16:59:41.656561 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef4834a8_a534_49a1_ba4e_07543a1d73ff.slice/crio-49c5189fe2f9c09dc98e6dd9490ed2837b141ee31dfec16f46c3e6f0f0ff2d94 WatchSource:0}: Error finding container 49c5189fe2f9c09dc98e6dd9490ed2837b141ee31dfec16f46c3e6f0f0ff2d94: Status 404 returned error can't find the container with id 49c5189fe2f9c09dc98e6dd9490ed2837b141ee31dfec16f46c3e6f0f0ff2d94 Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.659868 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l6kcr"] Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.960200 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf" event={"ID":"d4b791b8-523f-4cf0-9ec7-9283c2fd4dde","Type":"ContainerStarted","Data":"a36a7c6e2180ec2f8bc93353d652a312e752e3911260d82dbdb2decdd7be960d"} Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.961766 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf" Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.963607 4886 generic.go:334] "Generic (PLEG): container finished" podID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" containerID="7f55f9c6228c0244d9e3c7e38d2569229b65bf9a7ae3d928099a3cfae5ca1622" exitCode=0 Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.963644 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6kcr" event={"ID":"ef4834a8-a534-49a1-ba4e-07543a1d73ff","Type":"ContainerDied","Data":"7f55f9c6228c0244d9e3c7e38d2569229b65bf9a7ae3d928099a3cfae5ca1622"} Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.963665 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6kcr" event={"ID":"ef4834a8-a534-49a1-ba4e-07543a1d73ff","Type":"ContainerStarted","Data":"49c5189fe2f9c09dc98e6dd9490ed2837b141ee31dfec16f46c3e6f0f0ff2d94"} Jan 29 16:59:41 crc kubenswrapper[4886]: I0129 16:59:41.995306 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf" podStartSLOduration=2.465965501 podStartE2EDuration="6.995289115s" podCreationTimestamp="2026-01-29 16:59:35 +0000 UTC" firstStartedPulling="2026-01-29 16:59:36.21964987 +0000 UTC m=+2259.128369142" lastFinishedPulling="2026-01-29 16:59:40.748973484 +0000 UTC m=+2263.657692756" observedRunningTime="2026-01-29 16:59:41.993106334 +0000 UTC m=+2264.901825626" watchObservedRunningTime="2026-01-29 16:59:41.995289115 +0000 UTC m=+2264.904008397" Jan 29 16:59:42 crc kubenswrapper[4886]: E0129 16:59:42.955365 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef4834a8_a534_49a1_ba4e_07543a1d73ff.slice/crio-36216559b8eb83c21708f2fd9d52738d4492c30985da6e15593311023eaff4e2.scope\": RecentStats: unable to find data in memory cache]" Jan 29 16:59:43 crc kubenswrapper[4886]: I0129 16:59:43.980020 4886 generic.go:334] "Generic (PLEG): container finished" podID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" containerID="36216559b8eb83c21708f2fd9d52738d4492c30985da6e15593311023eaff4e2" exitCode=0 Jan 29 16:59:43 crc kubenswrapper[4886]: I0129 16:59:43.980111 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6kcr" event={"ID":"ef4834a8-a534-49a1-ba4e-07543a1d73ff","Type":"ContainerDied","Data":"36216559b8eb83c21708f2fd9d52738d4492c30985da6e15593311023eaff4e2"} Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.674297 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ftjt4"] Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.676642 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.679681 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ftjt4"] Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.745102 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-catalog-content\") pod \"community-operators-ftjt4\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.745191 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-utilities\") pod \"community-operators-ftjt4\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.745219 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9rhq\" (UniqueName: \"kubernetes.io/projected/15a7d478-4fe8-4737-87e0-092b2309852b-kube-api-access-l9rhq\") pod \"community-operators-ftjt4\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.846445 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-utilities\") pod \"community-operators-ftjt4\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.846502 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9rhq\" (UniqueName: \"kubernetes.io/projected/15a7d478-4fe8-4737-87e0-092b2309852b-kube-api-access-l9rhq\") pod \"community-operators-ftjt4\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.846602 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-catalog-content\") pod \"community-operators-ftjt4\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.847147 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-catalog-content\") pod \"community-operators-ftjt4\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.847146 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-utilities\") pod \"community-operators-ftjt4\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.866206 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9rhq\" (UniqueName: \"kubernetes.io/projected/15a7d478-4fe8-4737-87e0-092b2309852b-kube-api-access-l9rhq\") pod \"community-operators-ftjt4\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:44 crc kubenswrapper[4886]: I0129 16:59:44.989882 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6kcr" event={"ID":"ef4834a8-a534-49a1-ba4e-07543a1d73ff","Type":"ContainerStarted","Data":"fc506e5a1038e5ccbab48d49928b287cd8545ce7830f69a33adf11734bda8aaf"} Jan 29 16:59:45 crc kubenswrapper[4886]: I0129 16:59:45.002850 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:45 crc kubenswrapper[4886]: I0129 16:59:45.013267 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l6kcr" podStartSLOduration=2.505858937 podStartE2EDuration="5.013244437s" podCreationTimestamp="2026-01-29 16:59:40 +0000 UTC" firstStartedPulling="2026-01-29 16:59:41.96559099 +0000 UTC m=+2264.874310262" lastFinishedPulling="2026-01-29 16:59:44.47297649 +0000 UTC m=+2267.381695762" observedRunningTime="2026-01-29 16:59:45.009611664 +0000 UTC m=+2267.918330936" watchObservedRunningTime="2026-01-29 16:59:45.013244437 +0000 UTC m=+2267.921963709" Jan 29 16:59:45 crc kubenswrapper[4886]: I0129 16:59:45.538899 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ftjt4"] Jan 29 16:59:45 crc kubenswrapper[4886]: I0129 16:59:45.748365 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-86bf76f8cb-r9sbf" Jan 29 16:59:45 crc kubenswrapper[4886]: I0129 16:59:45.998583 4886 generic.go:334] "Generic (PLEG): container finished" podID="15a7d478-4fe8-4737-87e0-092b2309852b" containerID="958784e76577e7087aaa7c7d11f4f78ba2b156b2be0c93f2ecfe7b0844514e68" exitCode=0 Jan 29 16:59:45 crc kubenswrapper[4886]: I0129 16:59:45.998802 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftjt4" event={"ID":"15a7d478-4fe8-4737-87e0-092b2309852b","Type":"ContainerDied","Data":"958784e76577e7087aaa7c7d11f4f78ba2b156b2be0c93f2ecfe7b0844514e68"} Jan 29 16:59:45 crc kubenswrapper[4886]: I0129 16:59:45.998850 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftjt4" event={"ID":"15a7d478-4fe8-4737-87e0-092b2309852b","Type":"ContainerStarted","Data":"08d43d29ab5b2356ae9a1a801ed2dac107c26afe2209d1165714e6d9a8ed91ec"} Jan 29 16:59:47 crc kubenswrapper[4886]: I0129 16:59:47.006833 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftjt4" event={"ID":"15a7d478-4fe8-4737-87e0-092b2309852b","Type":"ContainerStarted","Data":"ab0631061378de0825d90277572d0835271d2870feb942a13be33aaadea313be"} Jan 29 16:59:48 crc kubenswrapper[4886]: I0129 16:59:48.017209 4886 generic.go:334] "Generic (PLEG): container finished" podID="15a7d478-4fe8-4737-87e0-092b2309852b" containerID="ab0631061378de0825d90277572d0835271d2870feb942a13be33aaadea313be" exitCode=0 Jan 29 16:59:48 crc kubenswrapper[4886]: I0129 16:59:48.017258 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftjt4" event={"ID":"15a7d478-4fe8-4737-87e0-092b2309852b","Type":"ContainerDied","Data":"ab0631061378de0825d90277572d0835271d2870feb942a13be33aaadea313be"} Jan 29 16:59:48 crc kubenswrapper[4886]: I0129 16:59:48.019267 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 16:59:51 crc kubenswrapper[4886]: I0129 16:59:51.194647 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:51 crc kubenswrapper[4886]: I0129 16:59:51.195030 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:51 crc kubenswrapper[4886]: I0129 16:59:51.245472 4886 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:52 crc kubenswrapper[4886]: I0129 16:59:52.048770 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftjt4" event={"ID":"15a7d478-4fe8-4737-87e0-092b2309852b","Type":"ContainerStarted","Data":"178a4fc4a6ab2ef1c06ebd2a559deefd40e2d485747bf60722673762411e0255"} Jan 29 16:59:52 crc kubenswrapper[4886]: I0129 16:59:52.072819 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ftjt4" podStartSLOduration=3.251168691 podStartE2EDuration="8.072796351s" podCreationTimestamp="2026-01-29 16:59:44 +0000 UTC" firstStartedPulling="2026-01-29 16:59:45.999971881 +0000 UTC m=+2268.908691153" lastFinishedPulling="2026-01-29 16:59:50.821599551 +0000 UTC m=+2273.730318813" observedRunningTime="2026-01-29 16:59:52.066093532 +0000 UTC m=+2274.974812814" watchObservedRunningTime="2026-01-29 16:59:52.072796351 +0000 UTC m=+2274.981515623" Jan 29 16:59:52 crc kubenswrapper[4886]: I0129 16:59:52.094854 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:54 crc kubenswrapper[4886]: I0129 16:59:54.259529 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l6kcr"] Jan 29 16:59:54 crc kubenswrapper[4886]: I0129 16:59:54.260083 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l6kcr" podUID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" containerName="registry-server" containerID="cri-o://fc506e5a1038e5ccbab48d49928b287cd8545ce7830f69a33adf11734bda8aaf" gracePeriod=2 Jan 29 16:59:54 crc kubenswrapper[4886]: E0129 16:59:54.928890 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef4834a8_a534_49a1_ba4e_07543a1d73ff.slice/crio-conmon-fc506e5a1038e5ccbab48d49928b287cd8545ce7830f69a33adf11734bda8aaf.scope\": RecentStats: unable to find data in memory cache]" Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.003113 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.003177 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.062157 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ftjt4" Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.085782 4886 generic.go:334] "Generic (PLEG): container finished" podID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" containerID="fc506e5a1038e5ccbab48d49928b287cd8545ce7830f69a33adf11734bda8aaf" exitCode=0 Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.086838 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6kcr" event={"ID":"ef4834a8-a534-49a1-ba4e-07543a1d73ff","Type":"ContainerDied","Data":"fc506e5a1038e5ccbab48d49928b287cd8545ce7830f69a33adf11734bda8aaf"} Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.222290 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.317898 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-utilities\") pod \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.317966 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp2tz\" (UniqueName: \"kubernetes.io/projected/ef4834a8-a534-49a1-ba4e-07543a1d73ff-kube-api-access-hp2tz\") pod \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.318101 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-catalog-content\") pod \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\" (UID: \"ef4834a8-a534-49a1-ba4e-07543a1d73ff\") " Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.319027 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-utilities" (OuterVolumeSpecName: "utilities") pod "ef4834a8-a534-49a1-ba4e-07543a1d73ff" (UID: "ef4834a8-a534-49a1-ba4e-07543a1d73ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.325965 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef4834a8-a534-49a1-ba4e-07543a1d73ff-kube-api-access-hp2tz" (OuterVolumeSpecName: "kube-api-access-hp2tz") pod "ef4834a8-a534-49a1-ba4e-07543a1d73ff" (UID: "ef4834a8-a534-49a1-ba4e-07543a1d73ff"). InnerVolumeSpecName "kube-api-access-hp2tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.420102 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.420141 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp2tz\" (UniqueName: \"kubernetes.io/projected/ef4834a8-a534-49a1-ba4e-07543a1d73ff-kube-api-access-hp2tz\") on node \"crc\" DevicePath \"\"" Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.467441 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef4834a8-a534-49a1-ba4e-07543a1d73ff" (UID: "ef4834a8-a534-49a1-ba4e-07543a1d73ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 16:59:55 crc kubenswrapper[4886]: I0129 16:59:55.521208 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef4834a8-a534-49a1-ba4e-07543a1d73ff-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 16:59:56 crc kubenswrapper[4886]: I0129 16:59:56.095908 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6kcr" event={"ID":"ef4834a8-a534-49a1-ba4e-07543a1d73ff","Type":"ContainerDied","Data":"49c5189fe2f9c09dc98e6dd9490ed2837b141ee31dfec16f46c3e6f0f0ff2d94"} Jan 29 16:59:56 crc kubenswrapper[4886]: I0129 16:59:56.095974 4886 scope.go:117] "RemoveContainer" containerID="fc506e5a1038e5ccbab48d49928b287cd8545ce7830f69a33adf11734bda8aaf" Jan 29 16:59:56 crc kubenswrapper[4886]: I0129 16:59:56.095998 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l6kcr" Jan 29 16:59:56 crc kubenswrapper[4886]: I0129 16:59:56.119929 4886 scope.go:117] "RemoveContainer" containerID="36216559b8eb83c21708f2fd9d52738d4492c30985da6e15593311023eaff4e2" Jan 29 16:59:56 crc kubenswrapper[4886]: I0129 16:59:56.128271 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l6kcr"] Jan 29 16:59:56 crc kubenswrapper[4886]: I0129 16:59:56.135544 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l6kcr"] Jan 29 16:59:56 crc kubenswrapper[4886]: I0129 16:59:56.160713 4886 scope.go:117] "RemoveContainer" containerID="7f55f9c6228c0244d9e3c7e38d2569229b65bf9a7ae3d928099a3cfae5ca1622" Jan 29 16:59:56 crc kubenswrapper[4886]: I0129 16:59:56.625377 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" path="/var/lib/kubelet/pods/ef4834a8-a534-49a1-ba4e-07543a1d73ff/volumes" Jan 29 16:59:59 crc kubenswrapper[4886]: I0129 16:59:59.660762 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 16:59:59 crc kubenswrapper[4886]: I0129 16:59:59.661403 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.134669 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666"] Jan 29 17:00:00 crc kubenswrapper[4886]: E0129 17:00:00.136558 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" containerName="registry-server" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.136582 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" containerName="registry-server" Jan 29 17:00:00 crc kubenswrapper[4886]: E0129 17:00:00.136604 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" containerName="extract-utilities" Jan 29 17:00:00 crc kubenswrapper[4886]: 
I0129 17:00:00.136612 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" containerName="extract-utilities" Jan 29 17:00:00 crc kubenswrapper[4886]: E0129 17:00:00.136627 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" containerName="extract-content" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.136634 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" containerName="extract-content" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.136812 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef4834a8-a534-49a1-ba4e-07543a1d73ff" containerName="registry-server" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.137380 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.139045 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.139390 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.153318 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666"] Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.200934 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcxn\" (UniqueName: \"kubernetes.io/projected/3da2d212-de01-458b-9805-8eb21ed83324-kube-api-access-4wcxn\") pod \"collect-profiles-29495100-wk666\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.201188 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3da2d212-de01-458b-9805-8eb21ed83324-secret-volume\") pod \"collect-profiles-29495100-wk666\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.201590 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3da2d212-de01-458b-9805-8eb21ed83324-config-volume\") pod \"collect-profiles-29495100-wk666\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.303314 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3da2d212-de01-458b-9805-8eb21ed83324-config-volume\") pod \"collect-profiles-29495100-wk666\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.303398 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcxn\" (UniqueName: 
\"kubernetes.io/projected/3da2d212-de01-458b-9805-8eb21ed83324-kube-api-access-4wcxn\") pod \"collect-profiles-29495100-wk666\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.303450 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3da2d212-de01-458b-9805-8eb21ed83324-secret-volume\") pod \"collect-profiles-29495100-wk666\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.304523 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3da2d212-de01-458b-9805-8eb21ed83324-config-volume\") pod \"collect-profiles-29495100-wk666\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.316879 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3da2d212-de01-458b-9805-8eb21ed83324-secret-volume\") pod \"collect-profiles-29495100-wk666\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.320781 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcxn\" (UniqueName: \"kubernetes.io/projected/3da2d212-de01-458b-9805-8eb21ed83324-kube-api-access-4wcxn\") pod \"collect-profiles-29495100-wk666\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.453250 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:00 crc kubenswrapper[4886]: I0129 17:00:00.875639 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666"] Jan 29 17:00:00 crc kubenswrapper[4886]: W0129 17:00:00.888007 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3da2d212_de01_458b_9805_8eb21ed83324.slice/crio-948a7f05186c9ca396055303ccf91cac221f2785d4fcde2bf60418d979c118d9 WatchSource:0}: Error finding container 948a7f05186c9ca396055303ccf91cac221f2785d4fcde2bf60418d979c118d9: Status 404 returned error can't find the container with id 948a7f05186c9ca396055303ccf91cac221f2785d4fcde2bf60418d979c118d9 Jan 29 17:00:01 crc kubenswrapper[4886]: I0129 17:00:01.146818 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" event={"ID":"3da2d212-de01-458b-9805-8eb21ed83324","Type":"ContainerStarted","Data":"3f2a5d53f1118cb99d6ac0f75863b8e8419b33babb29267642e06437ed3d61f8"} Jan 29 17:00:01 crc kubenswrapper[4886]: I0129 17:00:01.147162 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" event={"ID":"3da2d212-de01-458b-9805-8eb21ed83324","Type":"ContainerStarted","Data":"948a7f05186c9ca396055303ccf91cac221f2785d4fcde2bf60418d979c118d9"} Jan 29 17:00:02 crc kubenswrapper[4886]: I0129 17:00:02.156603 4886 generic.go:334] "Generic (PLEG): container finished" podID="3da2d212-de01-458b-9805-8eb21ed83324" containerID="3f2a5d53f1118cb99d6ac0f75863b8e8419b33babb29267642e06437ed3d61f8" exitCode=0 Jan 29 17:00:02 crc kubenswrapper[4886]: I0129 17:00:02.156687 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" event={"ID":"3da2d212-de01-458b-9805-8eb21ed83324","Type":"ContainerDied","Data":"3f2a5d53f1118cb99d6ac0f75863b8e8419b33babb29267642e06437ed3d61f8"} Jan 29 17:00:03 crc kubenswrapper[4886]: I0129 17:00:03.515948 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:03 crc kubenswrapper[4886]: I0129 17:00:03.567969 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3da2d212-de01-458b-9805-8eb21ed83324-config-volume\") pod \"3da2d212-de01-458b-9805-8eb21ed83324\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " Jan 29 17:00:03 crc kubenswrapper[4886]: I0129 17:00:03.568166 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wcxn\" (UniqueName: \"kubernetes.io/projected/3da2d212-de01-458b-9805-8eb21ed83324-kube-api-access-4wcxn\") pod \"3da2d212-de01-458b-9805-8eb21ed83324\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " Jan 29 17:00:03 crc kubenswrapper[4886]: I0129 17:00:03.568379 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3da2d212-de01-458b-9805-8eb21ed83324-secret-volume\") pod \"3da2d212-de01-458b-9805-8eb21ed83324\" (UID: \"3da2d212-de01-458b-9805-8eb21ed83324\") " Jan 29 17:00:03 crc kubenswrapper[4886]: I0129 17:00:03.569645 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3da2d212-de01-458b-9805-8eb21ed83324-config-volume" (OuterVolumeSpecName: "config-volume") pod "3da2d212-de01-458b-9805-8eb21ed83324" (UID: "3da2d212-de01-458b-9805-8eb21ed83324"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:00:03 crc kubenswrapper[4886]: I0129 17:00:03.573874 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da2d212-de01-458b-9805-8eb21ed83324-kube-api-access-4wcxn" (OuterVolumeSpecName: "kube-api-access-4wcxn") pod "3da2d212-de01-458b-9805-8eb21ed83324" (UID: "3da2d212-de01-458b-9805-8eb21ed83324"). InnerVolumeSpecName "kube-api-access-4wcxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:00:03 crc kubenswrapper[4886]: I0129 17:00:03.575284 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da2d212-de01-458b-9805-8eb21ed83324-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3da2d212-de01-458b-9805-8eb21ed83324" (UID: "3da2d212-de01-458b-9805-8eb21ed83324"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:00:03 crc kubenswrapper[4886]: I0129 17:00:03.672188 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wcxn\" (UniqueName: \"kubernetes.io/projected/3da2d212-de01-458b-9805-8eb21ed83324-kube-api-access-4wcxn\") on node \"crc\" DevicePath \"\"" Jan 29 17:00:03 crc kubenswrapper[4886]: I0129 17:00:03.672251 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3da2d212-de01-458b-9805-8eb21ed83324-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 17:00:03 crc kubenswrapper[4886]: I0129 17:00:03.672277 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3da2d212-de01-458b-9805-8eb21ed83324-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 17:00:04 crc kubenswrapper[4886]: I0129 17:00:04.187901 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" event={"ID":"3da2d212-de01-458b-9805-8eb21ed83324","Type":"ContainerDied","Data":"948a7f05186c9ca396055303ccf91cac221f2785d4fcde2bf60418d979c118d9"} Jan 29 17:00:04 crc kubenswrapper[4886]: I0129 17:00:04.188250 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="948a7f05186c9ca396055303ccf91cac221f2785d4fcde2bf60418d979c118d9" Jan 29 17:00:04 crc kubenswrapper[4886]: I0129 17:00:04.187992 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666" Jan 29 17:00:04 crc kubenswrapper[4886]: I0129 17:00:04.582479 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495055-bkqmf"] Jan 29 17:00:04 crc kubenswrapper[4886]: I0129 17:00:04.599435 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495055-bkqmf"] Jan 29 17:00:04 crc kubenswrapper[4886]: I0129 17:00:04.626532 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a20685-8c41-4c3b-9b91-fe1e05cf5fe9" path="/var/lib/kubelet/pods/a7a20685-8c41-4c3b-9b91-fe1e05cf5fe9/volumes" Jan 29 17:00:05 crc kubenswrapper[4886]: I0129 17:00:05.073109 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ftjt4" Jan 29 17:00:05 crc kubenswrapper[4886]: I0129 17:00:05.136082 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ftjt4"] Jan 29 17:00:05 crc kubenswrapper[4886]: I0129 17:00:05.196487 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ftjt4" podUID="15a7d478-4fe8-4737-87e0-092b2309852b" containerName="registry-server" containerID="cri-o://178a4fc4a6ab2ef1c06ebd2a559deefd40e2d485747bf60722673762411e0255" gracePeriod=2 Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.209990 4886 generic.go:334] "Generic (PLEG): container finished" podID="15a7d478-4fe8-4737-87e0-092b2309852b" containerID="178a4fc4a6ab2ef1c06ebd2a559deefd40e2d485747bf60722673762411e0255" exitCode=0 Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.210081 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftjt4" 
event={"ID":"15a7d478-4fe8-4737-87e0-092b2309852b","Type":"ContainerDied","Data":"178a4fc4a6ab2ef1c06ebd2a559deefd40e2d485747bf60722673762411e0255"} Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.339796 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftjt4" Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.423170 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9rhq\" (UniqueName: \"kubernetes.io/projected/15a7d478-4fe8-4737-87e0-092b2309852b-kube-api-access-l9rhq\") pod \"15a7d478-4fe8-4737-87e0-092b2309852b\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.423248 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-catalog-content\") pod \"15a7d478-4fe8-4737-87e0-092b2309852b\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.423299 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-utilities\") pod \"15a7d478-4fe8-4737-87e0-092b2309852b\" (UID: \"15a7d478-4fe8-4737-87e0-092b2309852b\") " Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.424974 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-utilities" (OuterVolumeSpecName: "utilities") pod "15a7d478-4fe8-4737-87e0-092b2309852b" (UID: "15a7d478-4fe8-4737-87e0-092b2309852b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.429592 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a7d478-4fe8-4737-87e0-092b2309852b-kube-api-access-l9rhq" (OuterVolumeSpecName: "kube-api-access-l9rhq") pod "15a7d478-4fe8-4737-87e0-092b2309852b" (UID: "15a7d478-4fe8-4737-87e0-092b2309852b"). InnerVolumeSpecName "kube-api-access-l9rhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.490053 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15a7d478-4fe8-4737-87e0-092b2309852b" (UID: "15a7d478-4fe8-4737-87e0-092b2309852b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.525511 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9rhq\" (UniqueName: \"kubernetes.io/projected/15a7d478-4fe8-4737-87e0-092b2309852b-kube-api-access-l9rhq\") on node \"crc\" DevicePath \"\"" Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.525540 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:00:06 crc kubenswrapper[4886]: I0129 17:00:06.525551 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15a7d478-4fe8-4737-87e0-092b2309852b-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:00:07 crc kubenswrapper[4886]: I0129 17:00:07.221812 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftjt4" event={"ID":"15a7d478-4fe8-4737-87e0-092b2309852b","Type":"ContainerDied","Data":"08d43d29ab5b2356ae9a1a801ed2dac107c26afe2209d1165714e6d9a8ed91ec"} Jan 29 17:00:07 crc kubenswrapper[4886]: I0129 17:00:07.222175 4886 scope.go:117] "RemoveContainer" containerID="178a4fc4a6ab2ef1c06ebd2a559deefd40e2d485747bf60722673762411e0255" Jan 29 17:00:07 crc kubenswrapper[4886]: I0129 17:00:07.221927 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftjt4" Jan 29 17:00:07 crc kubenswrapper[4886]: I0129 17:00:07.249276 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ftjt4"] Jan 29 17:00:07 crc kubenswrapper[4886]: I0129 17:00:07.251087 4886 scope.go:117] "RemoveContainer" containerID="ab0631061378de0825d90277572d0835271d2870feb942a13be33aaadea313be" Jan 29 17:00:07 crc kubenswrapper[4886]: I0129 17:00:07.270496 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ftjt4"] Jan 29 17:00:07 crc kubenswrapper[4886]: I0129 17:00:07.280591 4886 scope.go:117] "RemoveContainer" containerID="958784e76577e7087aaa7c7d11f4f78ba2b156b2be0c93f2ecfe7b0844514e68" Jan 29 17:00:08 crc kubenswrapper[4886]: I0129 17:00:08.625523 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a7d478-4fe8-4737-87e0-092b2309852b" path="/var/lib/kubelet/pods/15a7d478-4fe8-4737-87e0-092b2309852b/volumes" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.936725 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz"] Jan 29 17:00:21 crc kubenswrapper[4886]: E0129 17:00:21.937620 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da2d212-de01-458b-9805-8eb21ed83324" containerName="collect-profiles" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.937637 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da2d212-de01-458b-9805-8eb21ed83324" containerName="collect-profiles" Jan 29 17:00:21 crc kubenswrapper[4886]: E0129 17:00:21.937666 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a7d478-4fe8-4737-87e0-092b2309852b" containerName="extract-content" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.937673 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a7d478-4fe8-4737-87e0-092b2309852b" containerName="extract-content" Jan 29 17:00:21 crc kubenswrapper[4886]: 
E0129 17:00:21.937689 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a7d478-4fe8-4737-87e0-092b2309852b" containerName="registry-server" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.937695 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a7d478-4fe8-4737-87e0-092b2309852b" containerName="registry-server" Jan 29 17:00:21 crc kubenswrapper[4886]: E0129 17:00:21.937711 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a7d478-4fe8-4737-87e0-092b2309852b" containerName="extract-utilities" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.937717 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a7d478-4fe8-4737-87e0-092b2309852b" containerName="extract-utilities" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.937859 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a7d478-4fe8-4737-87e0-092b2309852b" containerName="registry-server" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.937880 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da2d212-de01-458b-9805-8eb21ed83324" containerName="collect-profiles" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.938395 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6"] Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.938967 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.939206 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.941384 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-gnsp5" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.941439 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-br6j7" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.956763 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6"] Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.992831 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-244l4\" (UniqueName: \"kubernetes.io/projected/4e16e340-e213-492a-9c93-851df7b1bddb-kube-api-access-244l4\") pod \"cinder-operator-controller-manager-8d874c8fc-w6qc6\" (UID: \"4e16e340-e213-492a-9c93-851df7b1bddb\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" Jan 29 17:00:21 crc kubenswrapper[4886]: I0129 17:00:21.992940 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjlpn\" (UniqueName: \"kubernetes.io/projected/3ffc5e8b-7f7a-4585-b43d-07e2589493c9-kube-api-access-mjlpn\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-2g2cz\" (UID: \"3ffc5e8b-7f7a-4585-b43d-07e2589493c9\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.008221 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.074592 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.075761 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.078346 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-xgnw7" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.085263 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.086392 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.090786 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-hwqr9" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.095376 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjlpn\" (UniqueName: \"kubernetes.io/projected/3ffc5e8b-7f7a-4585-b43d-07e2589493c9-kube-api-access-mjlpn\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-2g2cz\" (UID: \"3ffc5e8b-7f7a-4585-b43d-07e2589493c9\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.095464 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-244l4\" (UniqueName: \"kubernetes.io/projected/4e16e340-e213-492a-9c93-851df7b1bddb-kube-api-access-244l4\") pod \"cinder-operator-controller-manager-8d874c8fc-w6qc6\" (UID: \"4e16e340-e213-492a-9c93-851df7b1bddb\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.095511 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rxpp\" (UniqueName: \"kubernetes.io/projected/d01e417c-a1b0-445d-83eb-f3c21a492138-kube-api-access-5rxpp\") pod \"designate-operator-controller-manager-6d9697b7f4-rhxnz\" (UID: \"d01e417c-a1b0-445d-83eb-f3c21a492138\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.098697 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.107824 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.109025 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.123421 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.124490 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.125135 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-h9dkd" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.130772 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-hkrqg" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.139189 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.154007 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.158815 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjlpn\" (UniqueName: \"kubernetes.io/projected/3ffc5e8b-7f7a-4585-b43d-07e2589493c9-kube-api-access-mjlpn\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-2g2cz\" (UID: \"3ffc5e8b-7f7a-4585-b43d-07e2589493c9\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.166842 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-244l4\" (UniqueName: \"kubernetes.io/projected/4e16e340-e213-492a-9c93-851df7b1bddb-kube-api-access-244l4\") pod \"cinder-operator-controller-manager-8d874c8fc-w6qc6\" (UID: \"4e16e340-e213-492a-9c93-851df7b1bddb\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.169291 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-t5n28"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.170381 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.176736 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-94czq" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.176937 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.188392 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.193969 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-t5n28"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.200439 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgpxk\" (UniqueName: \"kubernetes.io/projected/f2898e34-e423-4576-a765-3919510dcd85-kube-api-access-jgpxk\") pod \"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.200500 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t2vc\" (UniqueName: \"kubernetes.io/projected/81b8c703-d895-41ce-8ca3-99fd6b6eecb6-kube-api-access-4t2vc\") pod \"horizon-operator-controller-manager-5fb775575f-4mmm8\" (UID: \"81b8c703-d895-41ce-8ca3-99fd6b6eecb6\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.200526 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rxpp\" (UniqueName: \"kubernetes.io/projected/d01e417c-a1b0-445d-83eb-f3c21a492138-kube-api-access-5rxpp\") pod \"designate-operator-controller-manager-6d9697b7f4-rhxnz\" (UID: \"d01e417c-a1b0-445d-83eb-f3c21a492138\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.200553 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wckhl\" (UniqueName: \"kubernetes.io/projected/02decfa9-69fb-46b5-8b30-30954e39d411-kube-api-access-wckhl\") pod \"glance-operator-controller-manager-8886f4c47-pfw9c\" (UID: \"02decfa9-69fb-46b5-8b30-30954e39d411\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.200590 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert\") pod \"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.200610 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59v5v\" (UniqueName: \"kubernetes.io/projected/3c56c53e-a292-4e75-b069-c1d06ceeb6c5-kube-api-access-59v5v\") pod 
\"heat-operator-controller-manager-69d6db494d-qf2xg\" (UID: \"3c56c53e-a292-4e75-b069-c1d06ceeb6c5\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.200913 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.201928 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.205680 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-sf2sl" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.208634 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.209551 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.213672 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-bp7xc" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.216210 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.217262 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.220155 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nk9m2" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.228903 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.240794 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rxpp\" (UniqueName: \"kubernetes.io/projected/d01e417c-a1b0-445d-83eb-f3c21a492138-kube-api-access-5rxpp\") pod \"designate-operator-controller-manager-6d9697b7f4-rhxnz\" (UID: \"d01e417c-a1b0-445d-83eb-f3c21a492138\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.249363 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.270580 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.277238 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.277743 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.278929 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.283999 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wxpgb" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.294602 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.295818 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.303343 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wckhl\" (UniqueName: \"kubernetes.io/projected/02decfa9-69fb-46b5-8b30-30954e39d411-kube-api-access-wckhl\") pod \"glance-operator-controller-manager-8886f4c47-pfw9c\" (UID: \"02decfa9-69fb-46b5-8b30-30954e39d411\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.303400 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert\") pod \"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.303424 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59v5v\" (UniqueName: \"kubernetes.io/projected/3c56c53e-a292-4e75-b069-c1d06ceeb6c5-kube-api-access-59v5v\") pod \"heat-operator-controller-manager-69d6db494d-qf2xg\" (UID: \"3c56c53e-a292-4e75-b069-c1d06ceeb6c5\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.303452 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5nj\" (UniqueName: \"kubernetes.io/projected/4c2d29a3-d017-4e76-9a82-02943a6b38bf-kube-api-access-pw5nj\") pod \"mariadb-operator-controller-manager-67bf948998-c4j5s\" (UID: \"4c2d29a3-d017-4e76-9a82-02943a6b38bf\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.303510 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgpxk\" (UniqueName: \"kubernetes.io/projected/f2898e34-e423-4576-a765-3919510dcd85-kube-api-access-jgpxk\") pod \"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.303534 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-bdh8t\" (UniqueName: \"kubernetes.io/projected/70336809-8231-4ed9-a912-8b668aaa53bb-kube-api-access-bdh8t\") pod \"manila-operator-controller-manager-7dd968899f-zpgq2\" (UID: \"70336809-8231-4ed9-a912-8b668aaa53bb\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.303579 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2w54\" (UniqueName: \"kubernetes.io/projected/10cac00e-0cd8-4d53-a4dd-3f6b5200e7e0-kube-api-access-j2w54\") pod \"ironic-operator-controller-manager-5f4b8bd54d-77z62\" (UID: \"10cac00e-0cd8-4d53-a4dd-3f6b5200e7e0\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.303604 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5skg\" (UniqueName: \"kubernetes.io/projected/67107e9f-cf09-4d35-af26-c77f4d76083a-kube-api-access-h5skg\") pod \"keystone-operator-controller-manager-84f48565d4-kwr4n\" (UID: \"67107e9f-cf09-4d35-af26-c77f4d76083a\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.303624 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t2vc\" (UniqueName: \"kubernetes.io/projected/81b8c703-d895-41ce-8ca3-99fd6b6eecb6-kube-api-access-4t2vc\") pod \"horizon-operator-controller-manager-5fb775575f-4mmm8\" (UID: \"81b8c703-d895-41ce-8ca3-99fd6b6eecb6\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" Jan 29 17:00:22 crc kubenswrapper[4886]: E0129 17:00:22.304004 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:22 crc kubenswrapper[4886]: E0129 17:00:22.304041 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert podName:f2898e34-e423-4576-a765-3919510dcd85 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:22.804027197 +0000 UTC m=+2305.712746469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert") pod "infra-operator-controller-manager-79955696d6-t5n28" (UID: "f2898e34-e423-4576-a765-3919510dcd85") : secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.318399 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.319392 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.322621 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mvzxw" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.331239 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgpxk\" (UniqueName: \"kubernetes.io/projected/f2898e34-e423-4576-a765-3919510dcd85-kube-api-access-jgpxk\") pod \"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.342007 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59v5v\" (UniqueName: \"kubernetes.io/projected/3c56c53e-a292-4e75-b069-c1d06ceeb6c5-kube-api-access-59v5v\") pod \"heat-operator-controller-manager-69d6db494d-qf2xg\" (UID: \"3c56c53e-a292-4e75-b069-c1d06ceeb6c5\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.347189 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t2vc\" (UniqueName: \"kubernetes.io/projected/81b8c703-d895-41ce-8ca3-99fd6b6eecb6-kube-api-access-4t2vc\") pod \"horizon-operator-controller-manager-5fb775575f-4mmm8\" (UID: \"81b8c703-d895-41ce-8ca3-99fd6b6eecb6\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.352666 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.354442 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.354546 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wckhl\" (UniqueName: \"kubernetes.io/projected/02decfa9-69fb-46b5-8b30-30954e39d411-kube-api-access-wckhl\") pod \"glance-operator-controller-manager-8886f4c47-pfw9c\" (UID: \"02decfa9-69fb-46b5-8b30-30954e39d411\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.359440 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-gml7r" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.364847 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.394121 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.403623 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.405025 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.408127 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5nj\" (UniqueName: \"kubernetes.io/projected/4c2d29a3-d017-4e76-9a82-02943a6b38bf-kube-api-access-pw5nj\") pod \"mariadb-operator-controller-manager-67bf948998-c4j5s\" (UID: \"4c2d29a3-d017-4e76-9a82-02943a6b38bf\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.408290 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6xj5\" (UniqueName: \"kubernetes.io/projected/053a2790-370f-44bd-a2c0-603ffb22ed3c-kube-api-access-z6xj5\") pod \"neutron-operator-controller-manager-585dbc889-9zqmc\" (UID: \"053a2790-370f-44bd-a2c0-603ffb22ed3c\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.408384 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdh8t\" (UniqueName: \"kubernetes.io/projected/70336809-8231-4ed9-a912-8b668aaa53bb-kube-api-access-bdh8t\") pod \"manila-operator-controller-manager-7dd968899f-zpgq2\" (UID: \"70336809-8231-4ed9-a912-8b668aaa53bb\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.408421 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvgcm\" (UniqueName: \"kubernetes.io/projected/c3cbde0f-6b5d-47cf-93e6-3d2e12051aba-kube-api-access-tvgcm\") pod \"nova-operator-controller-manager-55bff696bd-dxcgn\" (UID: \"c3cbde0f-6b5d-47cf-93e6-3d2e12051aba\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.408487 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2w54\" (UniqueName: \"kubernetes.io/projected/10cac00e-0cd8-4d53-a4dd-3f6b5200e7e0-kube-api-access-j2w54\") pod \"ironic-operator-controller-manager-5f4b8bd54d-77z62\" (UID: \"10cac00e-0cd8-4d53-a4dd-3f6b5200e7e0\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.408518 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5skg\" (UniqueName: \"kubernetes.io/projected/67107e9f-cf09-4d35-af26-c77f4d76083a-kube-api-access-h5skg\") pod \"keystone-operator-controller-manager-84f48565d4-kwr4n\" (UID: \"67107e9f-cf09-4d35-af26-c77f4d76083a\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.410892 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wv9wk" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.428889 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.433936 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2w54\" (UniqueName: \"kubernetes.io/projected/10cac00e-0cd8-4d53-a4dd-3f6b5200e7e0-kube-api-access-j2w54\") pod \"ironic-operator-controller-manager-5f4b8bd54d-77z62\" (UID: \"10cac00e-0cd8-4d53-a4dd-3f6b5200e7e0\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.434968 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5nj\" (UniqueName: \"kubernetes.io/projected/4c2d29a3-d017-4e76-9a82-02943a6b38bf-kube-api-access-pw5nj\") pod \"mariadb-operator-controller-manager-67bf948998-c4j5s\" (UID: \"4c2d29a3-d017-4e76-9a82-02943a6b38bf\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.435052 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.436301 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.439458 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.439795 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-9xgxp" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.443808 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdh8t\" (UniqueName: \"kubernetes.io/projected/70336809-8231-4ed9-a912-8b668aaa53bb-kube-api-access-bdh8t\") pod \"manila-operator-controller-manager-7dd968899f-zpgq2\" (UID: \"70336809-8231-4ed9-a912-8b668aaa53bb\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.454249 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5skg\" (UniqueName: \"kubernetes.io/projected/67107e9f-cf09-4d35-af26-c77f4d76083a-kube-api-access-h5skg\") pod \"keystone-operator-controller-manager-84f48565d4-kwr4n\" (UID: \"67107e9f-cf09-4d35-af26-c77f4d76083a\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.474905 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.487908 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.489959 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.496812 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-9j7mb" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.503605 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.504866 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.522674 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.522735 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmzzl\" (UniqueName: \"kubernetes.io/projected/7b52b050-b925-4562-8682-693917b7899c-kube-api-access-lmzzl\") pod \"octavia-operator-controller-manager-6687f8d877-8gq2g\" (UID: \"7b52b050-b925-4562-8682-693917b7899c\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.522805 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmzgb\" (UniqueName: \"kubernetes.io/projected/c2b6285c-ada4-43f6-8716-53b2afa13723-kube-api-access-nmzgb\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.522854 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6xj5\" (UniqueName: \"kubernetes.io/projected/053a2790-370f-44bd-a2c0-603ffb22ed3c-kube-api-access-z6xj5\") pod \"neutron-operator-controller-manager-585dbc889-9zqmc\" (UID: \"053a2790-370f-44bd-a2c0-603ffb22ed3c\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.522892 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvgcm\" (UniqueName: \"kubernetes.io/projected/c3cbde0f-6b5d-47cf-93e6-3d2e12051aba-kube-api-access-tvgcm\") pod \"nova-operator-controller-manager-55bff696bd-dxcgn\" (UID: \"c3cbde0f-6b5d-47cf-93e6-3d2e12051aba\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.538840 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.554138 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6xj5\" (UniqueName: \"kubernetes.io/projected/053a2790-370f-44bd-a2c0-603ffb22ed3c-kube-api-access-z6xj5\") pod 
\"neutron-operator-controller-manager-585dbc889-9zqmc\" (UID: \"053a2790-370f-44bd-a2c0-603ffb22ed3c\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.560863 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.575939 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvgcm\" (UniqueName: \"kubernetes.io/projected/c3cbde0f-6b5d-47cf-93e6-3d2e12051aba-kube-api-access-tvgcm\") pod \"nova-operator-controller-manager-55bff696bd-dxcgn\" (UID: \"c3cbde0f-6b5d-47cf-93e6-3d2e12051aba\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.603048 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.625430 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.625477 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmzzl\" (UniqueName: \"kubernetes.io/projected/7b52b050-b925-4562-8682-693917b7899c-kube-api-access-lmzzl\") pod \"octavia-operator-controller-manager-6687f8d877-8gq2g\" (UID: \"7b52b050-b925-4562-8682-693917b7899c\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.625528 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hvpx\" (UniqueName: \"kubernetes.io/projected/14d9257b-94ae-4b29-b45a-403e034535d3-kube-api-access-4hvpx\") pod \"ovn-operator-controller-manager-788c46999f-xnccq\" (UID: \"14d9257b-94ae-4b29-b45a-403e034535d3\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.625583 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmzgb\" (UniqueName: \"kubernetes.io/projected/c2b6285c-ada4-43f6-8716-53b2afa13723-kube-api-access-nmzgb\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:22 crc kubenswrapper[4886]: E0129 17:00:22.627068 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:22 crc kubenswrapper[4886]: E0129 17:00:22.627115 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert podName:c2b6285c-ada4-43f6-8716-53b2afa13723 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:23.127099598 +0000 UTC m=+2306.035818870 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" (UID: "c2b6285c-ada4-43f6-8716-53b2afa13723") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.652074 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.673424 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmzgb\" (UniqueName: \"kubernetes.io/projected/c2b6285c-ada4-43f6-8716-53b2afa13723-kube-api-access-nmzgb\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.673448 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.698207 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.698548 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.706453 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.736244 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hvpx\" (UniqueName: \"kubernetes.io/projected/14d9257b-94ae-4b29-b45a-403e034535d3-kube-api-access-4hvpx\") pod \"ovn-operator-controller-manager-788c46999f-xnccq\" (UID: \"14d9257b-94ae-4b29-b45a-403e034535d3\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.737966 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmzzl\" (UniqueName: \"kubernetes.io/projected/7b52b050-b925-4562-8682-693917b7899c-kube-api-access-lmzzl\") pod \"octavia-operator-controller-manager-6687f8d877-8gq2g\" (UID: \"7b52b050-b925-4562-8682-693917b7899c\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.738184 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.745697 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.747195 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.750411 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-dhtns" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.752507 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.757856 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.777270 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-jxfvf" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.790740 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.794984 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.796221 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hvpx\" (UniqueName: \"kubernetes.io/projected/14d9257b-94ae-4b29-b45a-403e034535d3-kube-api-access-4hvpx\") pod \"ovn-operator-controller-manager-788c46999f-xnccq\" (UID: \"14d9257b-94ae-4b29-b45a-403e034535d3\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.827962 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.839681 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8fvq\" (UniqueName: \"kubernetes.io/projected/53042ed9-d676-4bb4-bf7b-9b3520aafd12-kube-api-access-s8fvq\") pod \"placement-operator-controller-manager-5b964cf4cd-xt9wq\" (UID: \"53042ed9-d676-4bb4-bf7b-9b3520aafd12\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.839741 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert\") pod \"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.839796 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpd7b\" (UniqueName: \"kubernetes.io/projected/608c459b-5b47-478a-9e3a-d83d935ae7c7-kube-api-access-tpd7b\") pod \"swift-operator-controller-manager-68fc8c869-cmfj2\" (UID: \"608c459b-5b47-478a-9e3a-d83d935ae7c7\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" Jan 29 17:00:22 crc kubenswrapper[4886]: E0129 17:00:22.840452 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:22 crc kubenswrapper[4886]: E0129 17:00:22.840497 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert podName:f2898e34-e423-4576-a765-3919510dcd85 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:23.840482126 +0000 UTC m=+2306.749201398 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert") pod "infra-operator-controller-manager-79955696d6-t5n28" (UID: "f2898e34-e423-4576-a765-3919510dcd85") : secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.850403 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.852180 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.854423 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-dst5g" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.865947 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.899009 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.900017 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.906363 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mz9qx" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.921583 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.924020 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.942046 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-xnrxl"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.942879 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5c9r\" (UniqueName: \"kubernetes.io/projected/cbfeb105-c5ee-408e-aac9-e4128e58f0e3-kube-api-access-p5c9r\") pod \"test-operator-controller-manager-56f8bfcd9f-hf95f\" (UID: \"cbfeb105-c5ee-408e-aac9-e4128e58f0e3\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.942912 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcgmv\" (UniqueName: \"kubernetes.io/projected/7db85474-4c59-4db6-ab4a-51092ebd5c62-kube-api-access-wcgmv\") pod \"telemetry-operator-controller-manager-75495fd598-2hpj4\" (UID: \"7db85474-4c59-4db6-ab4a-51092ebd5c62\") " pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.942954 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8fvq\" (UniqueName: \"kubernetes.io/projected/53042ed9-d676-4bb4-bf7b-9b3520aafd12-kube-api-access-s8fvq\") pod \"placement-operator-controller-manager-5b964cf4cd-xt9wq\" (UID: \"53042ed9-d676-4bb4-bf7b-9b3520aafd12\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.943021 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpd7b\" (UniqueName: \"kubernetes.io/projected/608c459b-5b47-478a-9e3a-d83d935ae7c7-kube-api-access-tpd7b\") pod \"swift-operator-controller-manager-68fc8c869-cmfj2\" (UID: \"608c459b-5b47-478a-9e3a-d83d935ae7c7\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.943442 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.949931 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-ztslw" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.951131 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-xnrxl"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.976492 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.976936 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpd7b\" (UniqueName: \"kubernetes.io/projected/608c459b-5b47-478a-9e3a-d83d935ae7c7-kube-api-access-tpd7b\") pod \"swift-operator-controller-manager-68fc8c869-cmfj2\" (UID: \"608c459b-5b47-478a-9e3a-d83d935ae7c7\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.985194 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.985814 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4"] Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.989923 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.991581 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-q9gr7" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.991676 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.996982 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 29 17:00:22 crc kubenswrapper[4886]: I0129 17:00:22.998508 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8fvq\" (UniqueName: \"kubernetes.io/projected/53042ed9-d676-4bb4-bf7b-9b3520aafd12-kube-api-access-s8fvq\") pod \"placement-operator-controller-manager-5b964cf4cd-xt9wq\" (UID: \"53042ed9-d676-4bb4-bf7b-9b3520aafd12\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.000567 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9"] Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.001988 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.003892 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-kpzmg" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.013419 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9"] Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.045007 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k6m8\" (UniqueName: \"kubernetes.io/projected/037bf2ff-dd50-4d62-a525-5304c088cbc0-kube-api-access-5k6m8\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.045423 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kll8v\" (UniqueName: \"kubernetes.io/projected/165231a4-c627-484b-9aab-b4ce3feafe7e-kube-api-access-kll8v\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ffdr9\" (UID: \"165231a4-c627-484b-9aab-b4ce3feafe7e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.046553 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.046731 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5c9r\" (UniqueName: \"kubernetes.io/projected/cbfeb105-c5ee-408e-aac9-e4128e58f0e3-kube-api-access-p5c9r\") pod \"test-operator-controller-manager-56f8bfcd9f-hf95f\" (UID: \"cbfeb105-c5ee-408e-aac9-e4128e58f0e3\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.046806 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcgmv\" (UniqueName: \"kubernetes.io/projected/7db85474-4c59-4db6-ab4a-51092ebd5c62-kube-api-access-wcgmv\") pod \"telemetry-operator-controller-manager-75495fd598-2hpj4\" (UID: \"7db85474-4c59-4db6-ab4a-51092ebd5c62\") " pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.046906 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv82w\" (UniqueName: \"kubernetes.io/projected/6a145dac-4d02-493c-9bd8-2f9652fcb1d1-kube-api-access-kv82w\") pod \"watcher-operator-controller-manager-564965969-xnrxl\" (UID: \"6a145dac-4d02-493c-9bd8-2f9652fcb1d1\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.047003 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.087696 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcgmv\" (UniqueName: \"kubernetes.io/projected/7db85474-4c59-4db6-ab4a-51092ebd5c62-kube-api-access-wcgmv\") pod \"telemetry-operator-controller-manager-75495fd598-2hpj4\" (UID: \"7db85474-4c59-4db6-ab4a-51092ebd5c62\") " pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.088289 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5c9r\" (UniqueName: \"kubernetes.io/projected/cbfeb105-c5ee-408e-aac9-e4128e58f0e3-kube-api-access-p5c9r\") pod \"test-operator-controller-manager-56f8bfcd9f-hf95f\" (UID: \"cbfeb105-c5ee-408e-aac9-e4128e58f0e3\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.148173 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kll8v\" (UniqueName: \"kubernetes.io/projected/165231a4-c627-484b-9aab-b4ce3feafe7e-kube-api-access-kll8v\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ffdr9\" (UID: \"165231a4-c627-484b-9aab-b4ce3feafe7e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.148208 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.148271 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.148335 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv82w\" (UniqueName: \"kubernetes.io/projected/6a145dac-4d02-493c-9bd8-2f9652fcb1d1-kube-api-access-kv82w\") pod \"watcher-operator-controller-manager-564965969-xnrxl\" (UID: \"6a145dac-4d02-493c-9bd8-2f9652fcb1d1\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.148358 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.148422 4886 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5k6m8\" (UniqueName: \"kubernetes.io/projected/037bf2ff-dd50-4d62-a525-5304c088cbc0-kube-api-access-5k6m8\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.148519 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.148589 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert podName:c2b6285c-ada4-43f6-8716-53b2afa13723 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:24.148567056 +0000 UTC m=+2307.057286328 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" (UID: "c2b6285c-ada4-43f6-8716-53b2afa13723") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.148643 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.148671 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:23.648660819 +0000 UTC m=+2306.557380091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "metrics-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.148964 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.149012 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:23.648999168 +0000 UTC m=+2306.557718440 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "webhook-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.168042 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k6m8\" (UniqueName: \"kubernetes.io/projected/037bf2ff-dd50-4d62-a525-5304c088cbc0-kube-api-access-5k6m8\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.177166 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv82w\" (UniqueName: \"kubernetes.io/projected/6a145dac-4d02-493c-9bd8-2f9652fcb1d1-kube-api-access-kv82w\") pod \"watcher-operator-controller-manager-564965969-xnrxl\" (UID: \"6a145dac-4d02-493c-9bd8-2f9652fcb1d1\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.177281 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kll8v\" (UniqueName: \"kubernetes.io/projected/165231a4-c627-484b-9aab-b4ce3feafe7e-kube-api-access-kll8v\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ffdr9\" (UID: \"165231a4-c627-484b-9aab-b4ce3feafe7e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.252047 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.273802 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz"] Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.275034 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6"] Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.306637 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.341877 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.376978 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.387082 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" event={"ID":"4e16e340-e213-492a-9c93-851df7b1bddb","Type":"ContainerStarted","Data":"db35a820b3777a5851e8facf3ad0ecbcc7e64fd54a3aced1d804c9fbd5d7246a"} Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.388474 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" event={"ID":"3ffc5e8b-7f7a-4585-b43d-07e2589493c9","Type":"ContainerStarted","Data":"aa4cf6ed4345267a3570795019cb8b05fcff0ac8df1c63c18bd9de1b886b8442"} Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.438297 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.688061 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.688419 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.688492 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:24.688471682 +0000 UTC m=+2307.597190954 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "metrics-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.688512 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.688830 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.688863 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:24.688855563 +0000 UTC m=+2307.597574835 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "webhook-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.761262 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg"] Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.881704 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c"] Jan 29 17:00:23 crc kubenswrapper[4886]: W0129 17:00:23.893118 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02decfa9_69fb_46b5_8b30_30954e39d411.slice/crio-cd904d745ca033528a23c4f23f61d4912228fb1ee06650bb508b1e3956947400 WatchSource:0}: Error finding container cd904d745ca033528a23c4f23f61d4912228fb1ee06650bb508b1e3956947400: Status 404 returned error can't find the container with id cd904d745ca033528a23c4f23f61d4912228fb1ee06650bb508b1e3956947400 Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.894114 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert\") pod \"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.894384 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: E0129 17:00:23.894430 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert podName:f2898e34-e423-4576-a765-3919510dcd85 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:25.894415491 +0000 UTC m=+2308.803134763 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert") pod "infra-operator-controller-manager-79955696d6-t5n28" (UID: "f2898e34-e423-4576-a765-3919510dcd85") : secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:23 crc kubenswrapper[4886]: I0129 17:00:23.913959 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz"] Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:23.923148 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8"] Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.081744 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s"] Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.089224 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62"] Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.095303 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2"] Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.101585 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn"] Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.202252 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:25 crc kubenswrapper[4886]: E0129 17:00:24.202992 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:25 crc kubenswrapper[4886]: E0129 17:00:24.203050 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert podName:c2b6285c-ada4-43f6-8716-53b2afa13723 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:26.203031856 +0000 UTC m=+2309.111751128 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" (UID: "c2b6285c-ada4-43f6-8716-53b2afa13723") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.413512 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" event={"ID":"02decfa9-69fb-46b5-8b30-30954e39d411","Type":"ContainerStarted","Data":"cd904d745ca033528a23c4f23f61d4912228fb1ee06650bb508b1e3956947400"} Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.418715 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" event={"ID":"81b8c703-d895-41ce-8ca3-99fd6b6eecb6","Type":"ContainerStarted","Data":"ce76bb90ce03e73f284ea82f03e266ccc7338861c7bba9795e175aac0b53dd31"} Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.421394 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" event={"ID":"4c2d29a3-d017-4e76-9a82-02943a6b38bf","Type":"ContainerStarted","Data":"639db0c86c876ac827c93387845e1cf206d6ed3fed2f43d1aa8357fade4d598f"} Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.430755 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" event={"ID":"70336809-8231-4ed9-a912-8b668aaa53bb","Type":"ContainerStarted","Data":"f7d0ef1be0b7b9e5f87a6132728c16798b8e3959eb5b9c22272745a6c4006e53"} Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.440124 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" event={"ID":"3c56c53e-a292-4e75-b069-c1d06ceeb6c5","Type":"ContainerStarted","Data":"50d9d3ab14eb99b279b44d5ab3871b022f40d978e61529529c73987e5e7fdba4"} Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.445272 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" event={"ID":"c3cbde0f-6b5d-47cf-93e6-3d2e12051aba","Type":"ContainerStarted","Data":"695cc7e2d3be4658991cc89e25b5ee6e17aa1ad185177021e3410bd48a560eb1"} Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.446934 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" event={"ID":"d01e417c-a1b0-445d-83eb-f3c21a492138","Type":"ContainerStarted","Data":"548311bf15facc7ee9df41358726597c099c65c3d7f5e56b972cdfbe9d03afb4"} Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.447965 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62" event={"ID":"10cac00e-0cd8-4d53-a4dd-3f6b5200e7e0","Type":"ContainerStarted","Data":"65e1ef351905936df764e2e04cb24981be76e4325012871394a549d5a5d20b54"} Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.713730 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 
17:00:25 crc kubenswrapper[4886]: E0129 17:00:24.713929 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 17:00:25 crc kubenswrapper[4886]: E0129 17:00:24.714008 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:26.713989198 +0000 UTC m=+2309.622708470 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "metrics-server-cert" not found Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:24.714422 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:25 crc kubenswrapper[4886]: E0129 17:00:24.714549 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 17:00:25 crc kubenswrapper[4886]: E0129 17:00:24.714607 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:26.714583705 +0000 UTC m=+2309.623303057 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "webhook-server-cert" not found Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:25.727952 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f"] Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:25.735567 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g"] Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:25.749972 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2"] Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:25.768446 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq"] Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:25.784458 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq"] Jan 29 17:00:25 crc kubenswrapper[4886]: W0129 17:00:25.795195 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b52b050_b925_4562_8682_693917b7899c.slice/crio-7db470ef18fc217f81e53aed0aa7446ba74a4fc7176fb2d9b9bcd53bbc32d938 WatchSource:0}: Error finding container 7db470ef18fc217f81e53aed0aa7446ba74a4fc7176fb2d9b9bcd53bbc32d938: Status 404 returned error can't find the container with id 7db470ef18fc217f81e53aed0aa7446ba74a4fc7176fb2d9b9bcd53bbc32d938 Jan 29 17:00:25 crc kubenswrapper[4886]: W0129 17:00:25.796225 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod608c459b_5b47_478a_9e3a_d83d935ae7c7.slice/crio-37cbc13cfbafce0376297b208d259de8971217c0e71b19bb26439a7bfd3d08a9 WatchSource:0}: Error finding container 37cbc13cfbafce0376297b208d259de8971217c0e71b19bb26439a7bfd3d08a9: Status 404 returned error can't find the container with id 37cbc13cfbafce0376297b208d259de8971217c0e71b19bb26439a7bfd3d08a9 Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:25.800237 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n"] Jan 29 17:00:25 crc kubenswrapper[4886]: W0129 17:00:25.830723 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67107e9f_cf09_4d35_af26_c77f4d76083a.slice/crio-3ef2f289f5e872f27a84dd96f4882804758947fa5161cd292896e231f3b64b0f WatchSource:0}: Error finding container 3ef2f289f5e872f27a84dd96f4882804758947fa5161cd292896e231f3b64b0f: Status 404 returned error can't find the container with id 3ef2f289f5e872f27a84dd96f4882804758947fa5161cd292896e231f3b64b0f Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:25.857703 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc"] Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:25.953077 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert\") pod 
\"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:25 crc kubenswrapper[4886]: E0129 17:00:25.953315 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:25 crc kubenswrapper[4886]: E0129 17:00:25.953386 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert podName:f2898e34-e423-4576-a765-3919510dcd85 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:29.953371245 +0000 UTC m=+2312.862090507 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert") pod "infra-operator-controller-manager-79955696d6-t5n28" (UID: "f2898e34-e423-4576-a765-3919510dcd85") : secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:25 crc kubenswrapper[4886]: I0129 17:00:25.992615 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4"] Jan 29 17:00:26 crc kubenswrapper[4886]: W0129 17:00:26.003344 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7db85474_4c59_4db6_ab4a_51092ebd5c62.slice/crio-a410da94c921ce1ac560d29e5bb238702fb864ac3487b73f8e87335e2267b61f WatchSource:0}: Error finding container a410da94c921ce1ac560d29e5bb238702fb864ac3487b73f8e87335e2267b61f: Status 404 returned error can't find the container with id a410da94c921ce1ac560d29e5bb238702fb864ac3487b73f8e87335e2267b61f Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.024269 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-xnrxl"] Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.044984 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9"] Jan 29 17:00:26 crc kubenswrapper[4886]: W0129 17:00:26.054586 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a145dac_4d02_493c_9bd8_2f9652fcb1d1.slice/crio-8d42ceeb5d9bf64a7bed2661af6e701d19abe001843d00ab378a51f2b9af96b1 WatchSource:0}: Error finding container 8d42ceeb5d9bf64a7bed2661af6e701d19abe001843d00ab378a51f2b9af96b1: Status 404 returned error can't find the container with id 8d42ceeb5d9bf64a7bed2661af6e701d19abe001843d00ab378a51f2b9af96b1 Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.260421 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:26 crc kubenswrapper[4886]: E0129 17:00:26.260649 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:26 crc kubenswrapper[4886]: E0129 17:00:26.260700 4886 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert podName:c2b6285c-ada4-43f6-8716-53b2afa13723 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:30.260686463 +0000 UTC m=+2313.169405735 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" (UID: "c2b6285c-ada4-43f6-8716-53b2afa13723") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.473424 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" event={"ID":"14d9257b-94ae-4b29-b45a-403e034535d3","Type":"ContainerStarted","Data":"cf6b440152efb9317aca275b6d58dd2b7b288c79058354a01453a7dd476218ea"} Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.475978 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" event={"ID":"53042ed9-d676-4bb4-bf7b-9b3520aafd12","Type":"ContainerStarted","Data":"aa01ab8a81f918d3213c672f3a8af891e78314708981db6eb9e6c82dc62026ba"} Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.484501 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" event={"ID":"165231a4-c627-484b-9aab-b4ce3feafe7e","Type":"ContainerStarted","Data":"84c9a06b3d91b965b076c1dc5be61e2fa359472b876e80f6a30ddbd9fbf15160"} Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.486302 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" event={"ID":"053a2790-370f-44bd-a2c0-603ffb22ed3c","Type":"ContainerStarted","Data":"01edb524f0eecfe97b6696ff1f08b05f06a7d381aeae5df2ddf1a0620edc11c1"} Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.489155 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" event={"ID":"7b52b050-b925-4562-8682-693917b7899c","Type":"ContainerStarted","Data":"7db470ef18fc217f81e53aed0aa7446ba74a4fc7176fb2d9b9bcd53bbc32d938"} Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.491257 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" event={"ID":"cbfeb105-c5ee-408e-aac9-e4128e58f0e3","Type":"ContainerStarted","Data":"6992181f56c9dc20f7f0af22476858a99d0fe8af4d0c19429a6eaad302e469cc"} Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.492593 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" event={"ID":"6a145dac-4d02-493c-9bd8-2f9652fcb1d1","Type":"ContainerStarted","Data":"8d42ceeb5d9bf64a7bed2661af6e701d19abe001843d00ab378a51f2b9af96b1"} Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.494346 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" event={"ID":"7db85474-4c59-4db6-ab4a-51092ebd5c62","Type":"ContainerStarted","Data":"a410da94c921ce1ac560d29e5bb238702fb864ac3487b73f8e87335e2267b61f"} Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.495683 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" 
event={"ID":"67107e9f-cf09-4d35-af26-c77f4d76083a","Type":"ContainerStarted","Data":"3ef2f289f5e872f27a84dd96f4882804758947fa5161cd292896e231f3b64b0f"} Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.496717 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" event={"ID":"608c459b-5b47-478a-9e3a-d83d935ae7c7","Type":"ContainerStarted","Data":"37cbc13cfbafce0376297b208d259de8971217c0e71b19bb26439a7bfd3d08a9"} Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.769440 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:26 crc kubenswrapper[4886]: I0129 17:00:26.769629 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:26 crc kubenswrapper[4886]: E0129 17:00:26.769636 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 17:00:26 crc kubenswrapper[4886]: E0129 17:00:26.769735 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:30.769696531 +0000 UTC m=+2313.678415803 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "webhook-server-cert" not found Jan 29 17:00:26 crc kubenswrapper[4886]: E0129 17:00:26.769846 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 17:00:26 crc kubenswrapper[4886]: E0129 17:00:26.770302 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:30.770293118 +0000 UTC m=+2313.679012390 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "metrics-server-cert" not found Jan 29 17:00:29 crc kubenswrapper[4886]: I0129 17:00:29.660408 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:00:29 crc kubenswrapper[4886]: I0129 17:00:29.660764 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:00:29 crc kubenswrapper[4886]: I0129 17:00:29.660833 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 17:00:29 crc kubenswrapper[4886]: I0129 17:00:29.661609 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 17:00:29 crc kubenswrapper[4886]: I0129 17:00:29.661662 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" gracePeriod=600 Jan 29 17:00:30 crc kubenswrapper[4886]: I0129 17:00:30.030642 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert\") pod \"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:30 crc kubenswrapper[4886]: E0129 17:00:30.030789 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:30 crc kubenswrapper[4886]: E0129 17:00:30.030919 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert podName:f2898e34-e423-4576-a765-3919510dcd85 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:38.030904979 +0000 UTC m=+2320.939624251 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert") pod "infra-operator-controller-manager-79955696d6-t5n28" (UID: "f2898e34-e423-4576-a765-3919510dcd85") : secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:30 crc kubenswrapper[4886]: I0129 17:00:30.335369 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:30 crc kubenswrapper[4886]: E0129 17:00:30.335557 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:30 crc kubenswrapper[4886]: E0129 17:00:30.335667 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert podName:c2b6285c-ada4-43f6-8716-53b2afa13723 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:38.335640775 +0000 UTC m=+2321.244360097 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" (UID: "c2b6285c-ada4-43f6-8716-53b2afa13723") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:30 crc kubenswrapper[4886]: I0129 17:00:30.846042 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:30 crc kubenswrapper[4886]: I0129 17:00:30.846232 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:30 crc kubenswrapper[4886]: E0129 17:00:30.846400 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 17:00:30 crc kubenswrapper[4886]: E0129 17:00:30.846421 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 17:00:30 crc kubenswrapper[4886]: E0129 17:00:30.846583 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:38.846540566 +0000 UTC m=+2321.755259988 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "metrics-server-cert" not found Jan 29 17:00:30 crc kubenswrapper[4886]: E0129 17:00:30.846684 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:38.846661029 +0000 UTC m=+2321.755380301 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "webhook-server-cert" not found Jan 29 17:00:32 crc kubenswrapper[4886]: I0129 17:00:32.558121 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" exitCode=0 Jan 29 17:00:32 crc kubenswrapper[4886]: I0129 17:00:32.558165 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc"} Jan 29 17:00:32 crc kubenswrapper[4886]: I0129 17:00:32.559512 4886 scope.go:117] "RemoveContainer" containerID="8ef97582eea2927ab131d16b422621b32afa666846864a223a782bc24fb0ddda" Jan 29 17:00:36 crc kubenswrapper[4886]: E0129 17:00:36.032249 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage611160604/1\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382" Jan 29 17:00:36 crc kubenswrapper[4886]: E0129 17:00:36.032969 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5rxpp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-6d9697b7f4-rhxnz_openstack-operators(d01e417c-a1b0-445d-83eb-f3c21a492138): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage611160604/1\": happened during read: context canceled" logger="UnhandledError" Jan 29 17:00:36 crc kubenswrapper[4886]: E0129 17:00:36.034376 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage611160604/1\\\": happened during read: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" podUID="d01e417c-a1b0-445d-83eb-f3c21a492138" Jan 29 17:00:36 crc kubenswrapper[4886]: E0129 17:00:36.601952 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382\\\"\"" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" podUID="d01e417c-a1b0-445d-83eb-f3c21a492138" Jan 29 17:00:38 crc kubenswrapper[4886]: I0129 17:00:38.071137 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert\") pod \"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:38 crc kubenswrapper[4886]: E0129 17:00:38.071338 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:38 crc kubenswrapper[4886]: E0129 17:00:38.071651 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert podName:f2898e34-e423-4576-a765-3919510dcd85 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:54.071631731 +0000 UTC m=+2336.980351013 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert") pod "infra-operator-controller-manager-79955696d6-t5n28" (UID: "f2898e34-e423-4576-a765-3919510dcd85") : secret "infra-operator-webhook-server-cert" not found Jan 29 17:00:38 crc kubenswrapper[4886]: I0129 17:00:38.378147 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:38 crc kubenswrapper[4886]: E0129 17:00:38.378702 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:38 crc kubenswrapper[4886]: E0129 17:00:38.378879 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert podName:c2b6285c-ada4-43f6-8716-53b2afa13723 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:54.378855857 +0000 UTC m=+2337.287575139 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" (UID: "c2b6285c-ada4-43f6-8716-53b2afa13723") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 17:00:38 crc kubenswrapper[4886]: I0129 17:00:38.887334 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:38 crc kubenswrapper[4886]: I0129 17:00:38.887494 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:38 crc kubenswrapper[4886]: E0129 17:00:38.887588 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 17:00:38 crc kubenswrapper[4886]: E0129 17:00:38.887665 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:54.887646639 +0000 UTC m=+2337.796365911 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "metrics-server-cert" not found Jan 29 17:00:38 crc kubenswrapper[4886]: E0129 17:00:38.887682 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 17:00:38 crc kubenswrapper[4886]: E0129 17:00:38.887731 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs podName:037bf2ff-dd50-4d62-a525-5304c088cbc0 nodeName:}" failed. No retries permitted until 2026-01-29 17:00:54.88771622 +0000 UTC m=+2337.796435492 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs") pod "openstack-operator-controller-manager-546c7b8b6d-hngs4" (UID: "037bf2ff-dd50-4d62-a525-5304c088cbc0") : secret "webhook-server-cert" not found Jan 29 17:00:40 crc kubenswrapper[4886]: E0129 17:00:40.023495 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898" Jan 29 17:00:40 crc kubenswrapper[4886]: E0129 17:00:40.023682 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-244l4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d874c8fc-w6qc6_openstack-operators(4e16e340-e213-492a-9c93-851df7b1bddb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:00:40 crc kubenswrapper[4886]: E0129 17:00:40.024863 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" podUID="4e16e340-e213-492a-9c93-851df7b1bddb" Jan 29 17:00:40 crc kubenswrapper[4886]: E0129 17:00:40.457333 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c" Jan 29 17:00:40 crc kubenswrapper[4886]: E0129 17:00:40.457525 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mjlpn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7b6c4d8c5f-2g2cz_openstack-operators(3ffc5e8b-7f7a-4585-b43d-07e2589493c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:00:40 crc kubenswrapper[4886]: E0129 17:00:40.459387 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" podUID="3ffc5e8b-7f7a-4585-b43d-07e2589493c9" Jan 29 17:00:40 crc kubenswrapper[4886]: E0129 17:00:40.638527 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" podUID="3ffc5e8b-7f7a-4585-b43d-07e2589493c9" Jan 29 17:00:40 crc kubenswrapper[4886]: E0129 17:00:40.638750 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" podUID="4e16e340-e213-492a-9c93-851df7b1bddb" Jan 29 17:00:46 crc kubenswrapper[4886]: E0129 17:00:46.415508 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd: Get \"https://quay.io/v2/openstack-k8s-operators/rabbitmq-cluster-operator/blobs/sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd\": context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 29 17:00:46 crc kubenswrapper[4886]: E0129 17:00:46.416151 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kll8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-ffdr9_openstack-operators(165231a4-c627-484b-9aab-b4ce3feafe7e): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd: Get \"https://quay.io/v2/openstack-k8s-operators/rabbitmq-cluster-operator/blobs/sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd\": context canceled" logger="UnhandledError" Jan 29 17:00:46 crc kubenswrapper[4886]: E0129 17:00:46.417773 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd: Get \\\"https://quay.io/v2/openstack-k8s-operators/rabbitmq-cluster-operator/blobs/sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd\\\": context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" podUID="165231a4-c627-484b-9aab-b4ce3feafe7e" Jan 29 17:00:46 crc kubenswrapper[4886]: E0129 17:00:46.727884 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" podUID="165231a4-c627-484b-9aab-b4ce3feafe7e" Jan 29 17:00:47 crc kubenswrapper[4886]: E0129 17:00:47.291494 4886 log.go:32] "PullImage from image service failed" err="rpc error: code 
= Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566" Jan 29 17:00:47 crc kubenswrapper[4886]: E0129 17:00:47.291681 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bdh8t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7dd968899f-zpgq2_openstack-operators(70336809-8231-4ed9-a912-8b668aaa53bb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:00:47 crc kubenswrapper[4886]: E0129 17:00:47.293226 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" podUID="70336809-8231-4ed9-a912-8b668aaa53bb" Jan 29 17:00:47 crc kubenswrapper[4886]: E0129 17:00:47.749896 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566\\\"\"" 
pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" podUID="70336809-8231-4ed9-a912-8b668aaa53bb" Jan 29 17:00:47 crc kubenswrapper[4886]: E0129 17:00:47.848410 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4" Jan 29 17:00:47 crc kubenswrapper[4886]: E0129 17:00:47.848600 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4hvpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-788c46999f-xnccq_openstack-operators(14d9257b-94ae-4b29-b45a-403e034535d3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:00:47 crc kubenswrapper[4886]: E0129 17:00:47.849785 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" podUID="14d9257b-94ae-4b29-b45a-403e034535d3" Jan 29 17:00:48 crc kubenswrapper[4886]: E0129 17:00:48.553723 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:00:48 crc kubenswrapper[4886]: I0129 17:00:48.753970 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:00:48 crc kubenswrapper[4886]: E0129 17:00:48.754225 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:00:48 crc kubenswrapper[4886]: E0129 17:00:48.862077 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" podUID="14d9257b-94ae-4b29-b45a-403e034535d3" Jan 29 17:00:54 crc kubenswrapper[4886]: I0129 17:00:54.074969 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert\") pod \"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:54 crc kubenswrapper[4886]: I0129 17:00:54.081923 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2898e34-e423-4576-a765-3919510dcd85-cert\") pod \"infra-operator-controller-manager-79955696d6-t5n28\" (UID: \"f2898e34-e423-4576-a765-3919510dcd85\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:54 crc kubenswrapper[4886]: I0129 17:00:54.362820 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:00:54 crc kubenswrapper[4886]: I0129 17:00:54.381911 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:54 crc kubenswrapper[4886]: I0129 17:00:54.385778 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b6285c-ada4-43f6-8716-53b2afa13723-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh\" (UID: \"c2b6285c-ada4-43f6-8716-53b2afa13723\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:54 crc kubenswrapper[4886]: I0129 17:00:54.406276 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:00:54 crc kubenswrapper[4886]: I0129 17:00:54.895368 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:54 crc kubenswrapper[4886]: I0129 17:00:54.895784 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:54 crc kubenswrapper[4886]: I0129 17:00:54.902356 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-webhook-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:54 crc kubenswrapper[4886]: I0129 17:00:54.903543 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/037bf2ff-dd50-4d62-a525-5304c088cbc0-metrics-certs\") pod \"openstack-operator-controller-manager-546c7b8b6d-hngs4\" (UID: \"037bf2ff-dd50-4d62-a525-5304c088cbc0\") " pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:54 crc kubenswrapper[4886]: I0129 17:00:54.917360 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:00:57 crc kubenswrapper[4886]: E0129 17:00:57.071493 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10" Jan 29 17:00:57 crc kubenswrapper[4886]: E0129 17:00:57.072060 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-59v5v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-69d6db494d-qf2xg_openstack-operators(3c56c53e-a292-4e75-b069-c1d06ceeb6c5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:00:57 crc kubenswrapper[4886]: E0129 17:00:57.073214 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" podUID="3c56c53e-a292-4e75-b069-c1d06ceeb6c5" Jan 29 17:00:57 crc kubenswrapper[4886]: E0129 17:00:57.841180 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10\\\"\"" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" podUID="3c56c53e-a292-4e75-b069-c1d06ceeb6c5" Jan 29 17:01:00 crc kubenswrapper[4886]: I0129 17:01:00.032273 4886 scope.go:117] "RemoveContainer" containerID="e24030b3765055e623ca669573f5fe2306c10abdab283e014f331f200998a684" Jan 29 17:01:00 crc kubenswrapper[4886]: E0129 17:01:00.058225 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488" Jan 29 17:01:00 crc kubenswrapper[4886]: E0129 17:01:00.058418 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s8fvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-xt9wq_openstack-operators(53042ed9-d676-4bb4-bf7b-9b3520aafd12): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:00 crc kubenswrapper[4886]: E0129 17:01:00.059737 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" podUID="53042ed9-d676-4bb4-bf7b-9b3520aafd12" Jan 29 17:01:00 crc kubenswrapper[4886]: E0129 17:01:00.864071 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" podUID="53042ed9-d676-4bb4-bf7b-9b3520aafd12" Jan 29 17:01:02 crc kubenswrapper[4886]: E0129 17:01:02.101540 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8" Jan 29 17:01:02 crc kubenswrapper[4886]: E0129 17:01:02.101989 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4t2vc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5fb775575f-4mmm8_openstack-operators(81b8c703-d895-41ce-8ca3-99fd6b6eecb6): ErrImagePull: 
rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:02 crc kubenswrapper[4886]: E0129 17:01:02.103599 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" podUID="81b8c703-d895-41ce-8ca3-99fd6b6eecb6" Jan 29 17:01:02 crc kubenswrapper[4886]: I0129 17:01:02.614808 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:01:02 crc kubenswrapper[4886]: E0129 17:01:02.615064 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:01:03 crc kubenswrapper[4886]: E0129 17:01:03.132984 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:1f593e8d49d02b6484c89632192ae54771675c54fbd8426e3675b8e20ecfd7c4" Jan 29 17:01:03 crc kubenswrapper[4886]: E0129 17:01:03.133161 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:1f593e8d49d02b6484c89632192ae54771675c54fbd8426e3675b8e20ecfd7c4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wckhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-8886f4c47-pfw9c_openstack-operators(02decfa9-69fb-46b5-8b30-30954e39d411): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:03 crc kubenswrapper[4886]: E0129 17:01:03.135266 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" podUID="02decfa9-69fb-46b5-8b30-30954e39d411" Jan 29 17:01:03 crc kubenswrapper[4886]: E0129 17:01:03.411164 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" podUID="81b8c703-d895-41ce-8ca3-99fd6b6eecb6" Jan 29 17:01:03 crc kubenswrapper[4886]: E0129 17:01:03.888665 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:1f593e8d49d02b6484c89632192ae54771675c54fbd8426e3675b8e20ecfd7c4\\\"\"" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" podUID="02decfa9-69fb-46b5-8b30-30954e39d411" Jan 29 17:01:06 crc kubenswrapper[4886]: E0129 17:01:06.709849 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b" Jan 29 17:01:06 crc kubenswrapper[4886]: E0129 17:01:06.711186 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kv82w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-564965969-xnrxl_openstack-operators(6a145dac-4d02-493c-9bd8-2f9652fcb1d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:06 crc kubenswrapper[4886]: E0129 17:01:06.712534 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" podUID="6a145dac-4d02-493c-9bd8-2f9652fcb1d1" Jan 29 17:01:06 crc kubenswrapper[4886]: E0129 17:01:06.930025 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" podUID="6a145dac-4d02-493c-9bd8-2f9652fcb1d1" Jan 29 17:01:07 crc kubenswrapper[4886]: E0129 17:01:07.810891 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382" Jan 29 17:01:07 crc kubenswrapper[4886]: E0129 17:01:07.811439 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tpd7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68fc8c869-cmfj2_openstack-operators(608c459b-5b47-478a-9e3a-d83d935ae7c7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:07 crc kubenswrapper[4886]: E0129 17:01:07.812706 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" podUID="608c459b-5b47-478a-9e3a-d83d935ae7c7" Jan 29 17:01:07 crc kubenswrapper[4886]: E0129 17:01:07.917404 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" podUID="608c459b-5b47-478a-9e3a-d83d935ae7c7" Jan 29 17:01:13 crc kubenswrapper[4886]: I0129 17:01:13.616227 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:01:13 crc kubenswrapper[4886]: E0129 17:01:13.617189 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:01:14 crc kubenswrapper[4886]: E0129 17:01:14.313730 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6" Jan 29 17:01:14 crc kubenswrapper[4886]: E0129 17:01:14.313947 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z6xj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-585dbc889-9zqmc_openstack-operators(053a2790-370f-44bd-a2c0-603ffb22ed3c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:14 crc kubenswrapper[4886]: E0129 17:01:14.315121 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" podUID="053a2790-370f-44bd-a2c0-603ffb22ed3c" Jan 29 17:01:14 crc 
kubenswrapper[4886]: E0129 17:01:14.975303 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" podUID="053a2790-370f-44bd-a2c0-603ffb22ed3c" Jan 29 17:01:15 crc kubenswrapper[4886]: E0129 17:01:15.583103 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be" Jan 29 17:01:15 crc kubenswrapper[4886]: E0129 17:01:15.583303 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lmzzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6687f8d877-8gq2g_openstack-operators(7b52b050-b925-4562-8682-693917b7899c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:15 crc kubenswrapper[4886]: E0129 17:01:15.584606 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" podUID="7b52b050-b925-4562-8682-693917b7899c" Jan 29 17:01:15 crc kubenswrapper[4886]: E0129 17:01:15.984731 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" podUID="7b52b050-b925-4562-8682-693917b7899c" Jan 29 17:01:16 crc kubenswrapper[4886]: E0129 17:01:16.742897 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566" Jan 29 17:01:16 crc kubenswrapper[4886]: E0129 17:01:16.743106 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bdh8t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7dd968899f-zpgq2_openstack-operators(70336809-8231-4ed9-a912-8b668aaa53bb): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Jan 29 17:01:16 crc kubenswrapper[4886]: E0129 17:01:16.744372 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" podUID="70336809-8231-4ed9-a912-8b668aaa53bb" Jan 29 17:01:16 crc kubenswrapper[4886]: E0129 17:01:16.751570 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241" Jan 29 17:01:16 crc kubenswrapper[4886]: E0129 17:01:16.751739 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p5c9r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-hf95f_openstack-operators(cbfeb105-c5ee-408e-aac9-e4128e58f0e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:16 crc kubenswrapper[4886]: E0129 17:01:16.752948 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" podUID="cbfeb105-c5ee-408e-aac9-e4128e58f0e3" Jan 29 17:01:17 crc kubenswrapper[4886]: E0129 17:01:16.992211 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" podUID="cbfeb105-c5ee-408e-aac9-e4128e58f0e3" Jan 29 17:01:17 crc kubenswrapper[4886]: E0129 17:01:17.682949 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf" Jan 29 17:01:17 crc kubenswrapper[4886]: E0129 17:01:17.683149 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pw5nj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67bf948998-c4j5s_openstack-operators(4c2d29a3-d017-4e76-9a82-02943a6b38bf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Jan 29 17:01:17 crc kubenswrapper[4886]: E0129 17:01:17.684339 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" podUID="4c2d29a3-d017-4e76-9a82-02943a6b38bf" Jan 29 17:01:17 crc kubenswrapper[4886]: E0129 17:01:17.997974 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" podUID="4c2d29a3-d017-4e76-9a82-02943a6b38bf" Jan 29 17:01:18 crc kubenswrapper[4886]: E0129 17:01:18.529598 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e" Jan 29 17:01:18 crc kubenswrapper[4886]: E0129 17:01:18.530479 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tvgcm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} 
start failed in pod nova-operator-controller-manager-55bff696bd-dxcgn_openstack-operators(c3cbde0f-6b5d-47cf-93e6-3d2e12051aba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:18 crc kubenswrapper[4886]: E0129 17:01:18.532534 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" podUID="c3cbde0f-6b5d-47cf-93e6-3d2e12051aba" Jan 29 17:01:19 crc kubenswrapper[4886]: E0129 17:01:19.010898 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" podUID="c3cbde0f-6b5d-47cf-93e6-3d2e12051aba" Jan 29 17:01:19 crc kubenswrapper[4886]: E0129 17:01:19.761877 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.200:5001/openstack-k8s-operators/telemetry-operator:0e065ec457961704e9d1c504e4175b5fe8df623e" Jan 29 17:01:19 crc kubenswrapper[4886]: E0129 17:01:19.761937 4886 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.200:5001/openstack-k8s-operators/telemetry-operator:0e065ec457961704e9d1c504e4175b5fe8df623e" Jan 29 17:01:19 crc kubenswrapper[4886]: E0129 17:01:19.762107 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.200:5001/openstack-k8s-operators/telemetry-operator:0e065ec457961704e9d1c504e4175b5fe8df623e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wcgmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-75495fd598-2hpj4_openstack-operators(7db85474-4c59-4db6-ab4a-51092ebd5c62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:19 crc kubenswrapper[4886]: E0129 17:01:19.763408 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" podUID="7db85474-4c59-4db6-ab4a-51092ebd5c62" Jan 29 17:01:20 crc kubenswrapper[4886]: E0129 17:01:20.024976 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.200:5001/openstack-k8s-operators/telemetry-operator:0e065ec457961704e9d1c504e4175b5fe8df623e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" podUID="7db85474-4c59-4db6-ab4a-51092ebd5c62" Jan 29 17:01:21 crc kubenswrapper[4886]: E0129 17:01:21.018740 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382" Jan 29 17:01:21 crc kubenswrapper[4886]: E0129 17:01:21.018949 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5rxpp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-6d9697b7f4-rhxnz_openstack-operators(d01e417c-a1b0-445d-83eb-f3c21a492138): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:21 crc kubenswrapper[4886]: E0129 17:01:21.020178 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" podUID="d01e417c-a1b0-445d-83eb-f3c21a492138" Jan 29 17:01:22 crc kubenswrapper[4886]: E0129 17:01:22.882313 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17" Jan 29 17:01:22 crc kubenswrapper[4886]: E0129 17:01:22.883027 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5skg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-84f48565d4-kwr4n_openstack-operators(67107e9f-cf09-4d35-af26-c77f4d76083a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:22 crc kubenswrapper[4886]: E0129 17:01:22.884233 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" podUID="67107e9f-cf09-4d35-af26-c77f4d76083a" Jan 29 17:01:23 crc kubenswrapper[4886]: E0129 17:01:23.840359 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" podUID="67107e9f-cf09-4d35-af26-c77f4d76083a" Jan 29 17:01:25 crc kubenswrapper[4886]: I0129 17:01:25.615552 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:01:25 crc kubenswrapper[4886]: E0129 17:01:25.616427 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:01:27 crc kubenswrapper[4886]: E0129 17:01:27.848543 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 29 17:01:27 crc kubenswrapper[4886]: E0129 17:01:27.849110 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kll8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-ffdr9_openstack-operators(165231a4-c627-484b-9aab-b4ce3feafe7e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:01:27 crc kubenswrapper[4886]: E0129 17:01:27.850719 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" podUID="165231a4-c627-484b-9aab-b4ce3feafe7e" Jan 29 17:01:28 crc kubenswrapper[4886]: I0129 17:01:28.380175 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-t5n28"] Jan 29 17:01:28 crc kubenswrapper[4886]: W0129 17:01:28.403441 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2898e34_e423_4576_a765_3919510dcd85.slice/crio-32bad54df0a05d379850970b4bc6fa4c00d6a1b6eec5ddf09b64a9bc7353231b WatchSource:0}: Error finding container 32bad54df0a05d379850970b4bc6fa4c00d6a1b6eec5ddf09b64a9bc7353231b: Status 404 returned error can't find the container with id 32bad54df0a05d379850970b4bc6fa4c00d6a1b6eec5ddf09b64a9bc7353231b Jan 29 17:01:28 crc kubenswrapper[4886]: I0129 17:01:28.471756 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4"] Jan 29 17:01:28 crc kubenswrapper[4886]: I0129 17:01:28.614314 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh"] Jan 29 17:01:28 crc kubenswrapper[4886]: E0129 17:01:28.647678 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" podUID="70336809-8231-4ed9-a912-8b668aaa53bb" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.107488 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" event={"ID":"53042ed9-d676-4bb4-bf7b-9b3520aafd12","Type":"ContainerStarted","Data":"08631cad71c683ae7bc93ea38f8ec2a7efbc6831d0396ac48aebe884fe6bbe1c"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.108193 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.109417 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" event={"ID":"02decfa9-69fb-46b5-8b30-30954e39d411","Type":"ContainerStarted","Data":"3ba7acc051744e2e2125dd34e8289a04c4077f3f8fb45115cbb4dd6735c52ec1"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.109655 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.111134 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" event={"ID":"608c459b-5b47-478a-9e3a-d83d935ae7c7","Type":"ContainerStarted","Data":"27f7ecc14812bb16e02a66d7d60e9cacc89d5b8c40c2ffcd19146ed9cbcb9221"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.111351 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.112755 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" event={"ID":"037bf2ff-dd50-4d62-a525-5304c088cbc0","Type":"ContainerStarted","Data":"c19dc8dbddb237b0be234b39305b171ff3c8fede1daf2a27f71662844567e30c"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.112842 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" event={"ID":"037bf2ff-dd50-4d62-a525-5304c088cbc0","Type":"ContainerStarted","Data":"10c75283c7e6e3cd50f8debdbf1161ce254cb6c87ee175d8a1e5bd1d6ca877ea"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.112904 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.114410 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" event={"ID":"4e16e340-e213-492a-9c93-851df7b1bddb","Type":"ContainerStarted","Data":"529906ac788956a959a6dfa38ad9145f4e162db09f249ae9aa26a562137a393c"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 
17:01:29.114622 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.115965 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" event={"ID":"c2b6285c-ada4-43f6-8716-53b2afa13723","Type":"ContainerStarted","Data":"b0e418cb46ad17eb310f510a0a59751fbe5a247c55458b00418987b4f06bd783"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.117527 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" event={"ID":"6a145dac-4d02-493c-9bd8-2f9652fcb1d1","Type":"ContainerStarted","Data":"22f5ed753cfd8c08f1ff897163e136a0a25b32b0b9a1dbe8a68f3848234f1080"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.117778 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.119147 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" event={"ID":"14d9257b-94ae-4b29-b45a-403e034535d3","Type":"ContainerStarted","Data":"2a4a6ae6649f5fad516026d403eb47836c1f46f5b814d464817c8ac459496def"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.119366 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.120544 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62" event={"ID":"10cac00e-0cd8-4d53-a4dd-3f6b5200e7e0","Type":"ContainerStarted","Data":"8f65abf262a8949c0e08aae4a5f9b50c87e9a4fa88a2936ca60a659c65ed12cb"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.120704 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.121728 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" event={"ID":"f2898e34-e423-4576-a765-3919510dcd85","Type":"ContainerStarted","Data":"32bad54df0a05d379850970b4bc6fa4c00d6a1b6eec5ddf09b64a9bc7353231b"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.123233 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" event={"ID":"81b8c703-d895-41ce-8ca3-99fd6b6eecb6","Type":"ContainerStarted","Data":"50db265bbf35a4d0586f20b78bb6755925fb4c2fcdc76b9f9b71ed13398cf4e2"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.123474 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.124945 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" event={"ID":"3ffc5e8b-7f7a-4585-b43d-07e2589493c9","Type":"ContainerStarted","Data":"4583adefe889d5e4fa04809fe08e963718545c998f60131fec2ccfff152ec10b"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.125079 4886 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.126471 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" event={"ID":"3c56c53e-a292-4e75-b069-c1d06ceeb6c5","Type":"ContainerStarted","Data":"57f0d638acef9226c1817c1099045f15651a37faa50324df6baf8fd5d16315a3"} Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.126619 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.266341 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" podStartSLOduration=5.225412994 podStartE2EDuration="1m7.266309238s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:25.835315177 +0000 UTC m=+2308.744034449" lastFinishedPulling="2026-01-29 17:01:27.876211421 +0000 UTC m=+2370.784930693" observedRunningTime="2026-01-29 17:01:29.265259109 +0000 UTC m=+2372.173978381" watchObservedRunningTime="2026-01-29 17:01:29.266309238 +0000 UTC m=+2372.175028510" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.334932 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" podStartSLOduration=5.497004679 podStartE2EDuration="1m7.334915526s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:26.05993329 +0000 UTC m=+2308.968652562" lastFinishedPulling="2026-01-29 17:01:27.897844137 +0000 UTC m=+2370.806563409" observedRunningTime="2026-01-29 17:01:29.332899591 +0000 UTC m=+2372.241618863" watchObservedRunningTime="2026-01-29 17:01:29.334915526 +0000 UTC m=+2372.243634798" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.474723 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" podStartSLOduration=5.925280054 podStartE2EDuration="1m8.474704266s" podCreationTimestamp="2026-01-29 17:00:21 +0000 UTC" firstStartedPulling="2026-01-29 17:00:23.341547311 +0000 UTC m=+2306.250266583" lastFinishedPulling="2026-01-29 17:01:25.890971513 +0000 UTC m=+2368.799690795" observedRunningTime="2026-01-29 17:01:29.417953763 +0000 UTC m=+2372.326673035" watchObservedRunningTime="2026-01-29 17:01:29.474704266 +0000 UTC m=+2372.383423538" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.558481 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" podStartSLOduration=5.429190629 podStartE2EDuration="1m7.558461423s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:25.807758062 +0000 UTC m=+2308.716477344" lastFinishedPulling="2026-01-29 17:01:27.937028866 +0000 UTC m=+2370.845748138" observedRunningTime="2026-01-29 17:01:29.556019386 +0000 UTC m=+2372.464738678" watchObservedRunningTime="2026-01-29 17:01:29.558461423 +0000 UTC m=+2372.467180695" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.560737 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" podStartSLOduration=4.597201529 
podStartE2EDuration="1m8.560726105s" podCreationTimestamp="2026-01-29 17:00:21 +0000 UTC" firstStartedPulling="2026-01-29 17:00:23.929519418 +0000 UTC m=+2306.838238680" lastFinishedPulling="2026-01-29 17:01:27.893043984 +0000 UTC m=+2370.801763256" observedRunningTime="2026-01-29 17:01:29.473711069 +0000 UTC m=+2372.382430341" watchObservedRunningTime="2026-01-29 17:01:29.560726105 +0000 UTC m=+2372.469445377" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.637387 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" podStartSLOduration=67.637366376 podStartE2EDuration="1m7.637366376s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:01:29.607618557 +0000 UTC m=+2372.516337829" watchObservedRunningTime="2026-01-29 17:01:29.637366376 +0000 UTC m=+2372.546085658" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.672910 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" podStartSLOduration=6.123086582 podStartE2EDuration="1m8.672888985s" podCreationTimestamp="2026-01-29 17:00:21 +0000 UTC" firstStartedPulling="2026-01-29 17:00:23.341275733 +0000 UTC m=+2306.249995005" lastFinishedPulling="2026-01-29 17:01:25.891078096 +0000 UTC m=+2368.799797408" observedRunningTime="2026-01-29 17:01:29.636013119 +0000 UTC m=+2372.544732401" watchObservedRunningTime="2026-01-29 17:01:29.672888985 +0000 UTC m=+2372.581608257" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.679250 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" podStartSLOduration=4.687946172 podStartE2EDuration="1m8.679233869s" podCreationTimestamp="2026-01-29 17:00:21 +0000 UTC" firstStartedPulling="2026-01-29 17:00:23.901742257 +0000 UTC m=+2306.810461529" lastFinishedPulling="2026-01-29 17:01:27.893029954 +0000 UTC m=+2370.801749226" observedRunningTime="2026-01-29 17:01:29.654607381 +0000 UTC m=+2372.563326653" watchObservedRunningTime="2026-01-29 17:01:29.679233869 +0000 UTC m=+2372.587953141" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.701082 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" podStartSLOduration=4.686906142 podStartE2EDuration="1m8.701063561s" podCreationTimestamp="2026-01-29 17:00:21 +0000 UTC" firstStartedPulling="2026-01-29 17:00:23.791895589 +0000 UTC m=+2306.700614861" lastFinishedPulling="2026-01-29 17:01:27.806052988 +0000 UTC m=+2370.714772280" observedRunningTime="2026-01-29 17:01:29.69593831 +0000 UTC m=+2372.604657582" watchObservedRunningTime="2026-01-29 17:01:29.701063561 +0000 UTC m=+2372.609782833" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.723103 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" podStartSLOduration=5.724339838 podStartE2EDuration="1m7.723087767s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:25.807725441 +0000 UTC m=+2308.716444713" lastFinishedPulling="2026-01-29 17:01:27.80647336 +0000 UTC m=+2370.715192642" observedRunningTime="2026-01-29 17:01:29.718636805 +0000 
UTC m=+2372.627356077" watchObservedRunningTime="2026-01-29 17:01:29.723087767 +0000 UTC m=+2372.631807039" Jan 29 17:01:29 crc kubenswrapper[4886]: I0129 17:01:29.743221 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62" podStartSLOduration=12.300214887 podStartE2EDuration="1m7.743202161s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:24.08576721 +0000 UTC m=+2306.994486482" lastFinishedPulling="2026-01-29 17:01:19.528754484 +0000 UTC m=+2362.437473756" observedRunningTime="2026-01-29 17:01:29.735060637 +0000 UTC m=+2372.643779909" watchObservedRunningTime="2026-01-29 17:01:29.743202161 +0000 UTC m=+2372.651921433" Jan 29 17:01:33 crc kubenswrapper[4886]: I0129 17:01:33.256841 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-xt9wq" Jan 29 17:01:33 crc kubenswrapper[4886]: I0129 17:01:33.380975 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-xnrxl" Jan 29 17:01:33 crc kubenswrapper[4886]: E0129 17:01:33.616952 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382\\\"\"" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" podUID="d01e417c-a1b0-445d-83eb-f3c21a492138" Jan 29 17:01:34 crc kubenswrapper[4886]: I0129 17:01:34.924011 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-546c7b8b6d-hngs4" Jan 29 17:01:39 crc kubenswrapper[4886]: I0129 17:01:39.615582 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:01:39 crc kubenswrapper[4886]: E0129 17:01:39.616440 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.246680 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" event={"ID":"c2b6285c-ada4-43f6-8716-53b2afa13723","Type":"ContainerStarted","Data":"1eeec1940f0358f8bf1517780cc09baefe598dce219e723b21d9e385c74fe04b"} Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.247285 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.248388 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" event={"ID":"c3cbde0f-6b5d-47cf-93e6-3d2e12051aba","Type":"ContainerStarted","Data":"8e7e6c945083aad52b225a07e909c682655ca5a70c8963d43b4952ec8ca4b612"} Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 
17:01:41.248589 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.251874 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" event={"ID":"f2898e34-e423-4576-a765-3919510dcd85","Type":"ContainerStarted","Data":"d6cab8e8cbc1ca14b2c6e02750c867850499ec9790828f8ee283de7d764ea83d"} Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.254091 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" event={"ID":"053a2790-370f-44bd-a2c0-603ffb22ed3c","Type":"ContainerStarted","Data":"1898f8e239a7d5c23d50a83a89acce63c295993247ee81f49a76afabc303731c"} Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.254376 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.255932 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" event={"ID":"7db85474-4c59-4db6-ab4a-51092ebd5c62","Type":"ContainerStarted","Data":"8c637077bf4d9ad051c8d079b2d61c33cfa17c707c30487dbed27b7dd2bf5baf"} Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.256550 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.262961 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" event={"ID":"7b52b050-b925-4562-8682-693917b7899c","Type":"ContainerStarted","Data":"96acbcf5a952263baae2b5f40a51d7232b4238dcfd6172b4c09e0687a80ea6f6"} Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.263744 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.271269 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" event={"ID":"4c2d29a3-d017-4e76-9a82-02943a6b38bf","Type":"ContainerStarted","Data":"428345c51b77565a0a046dbcc4a2a80cf710824db549ca179d99c2c267860cd4"} Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.271933 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.275365 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" event={"ID":"cbfeb105-c5ee-408e-aac9-e4128e58f0e3","Type":"ContainerStarted","Data":"e53988331fb9322ad0fc5d89fe33040c2ede7e1105074dcb85a5b0b441bfd1ef"} Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.275773 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.310280 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" 
podStartSLOduration=67.768428637 podStartE2EDuration="1m19.310257993s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:01:28.642007313 +0000 UTC m=+2371.550726585" lastFinishedPulling="2026-01-29 17:01:40.183836669 +0000 UTC m=+2383.092555941" observedRunningTime="2026-01-29 17:01:41.28764109 +0000 UTC m=+2384.196360362" watchObservedRunningTime="2026-01-29 17:01:41.310257993 +0000 UTC m=+2384.218977265" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.353996 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" podStartSLOduration=5.06165201 podStartE2EDuration="1m19.353976437s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:25.830886882 +0000 UTC m=+2308.739606154" lastFinishedPulling="2026-01-29 17:01:40.123211309 +0000 UTC m=+2383.031930581" observedRunningTime="2026-01-29 17:01:41.320040452 +0000 UTC m=+2384.228759724" watchObservedRunningTime="2026-01-29 17:01:41.353976437 +0000 UTC m=+2384.262695719" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.354327 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" podStartSLOduration=5.177890842 podStartE2EDuration="1m19.354316557s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:26.007492856 +0000 UTC m=+2308.916212128" lastFinishedPulling="2026-01-29 17:01:40.183918571 +0000 UTC m=+2383.092637843" observedRunningTime="2026-01-29 17:01:41.345078112 +0000 UTC m=+2384.253797384" watchObservedRunningTime="2026-01-29 17:01:41.354316557 +0000 UTC m=+2384.263035829" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.365013 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" podStartSLOduration=5.021819874 podStartE2EDuration="1m19.364995521s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:25.79558658 +0000 UTC m=+2308.704305852" lastFinishedPulling="2026-01-29 17:01:40.138762227 +0000 UTC m=+2383.047481499" observedRunningTime="2026-01-29 17:01:41.359255403 +0000 UTC m=+2384.267974685" watchObservedRunningTime="2026-01-29 17:01:41.364995521 +0000 UTC m=+2384.273714793" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.383794 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" podStartSLOduration=3.272946811 podStartE2EDuration="1m19.383777958s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:24.073782703 +0000 UTC m=+2306.982501975" lastFinishedPulling="2026-01-29 17:01:40.18461385 +0000 UTC m=+2383.093333122" observedRunningTime="2026-01-29 17:01:41.376370084 +0000 UTC m=+2384.285089356" watchObservedRunningTime="2026-01-29 17:01:41.383777958 +0000 UTC m=+2384.292497230" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.412067 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" podStartSLOduration=3.335934165 podStartE2EDuration="1m19.412049257s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:24.108265662 +0000 UTC m=+2307.016984934" lastFinishedPulling="2026-01-29 
17:01:40.184380754 +0000 UTC m=+2383.093100026" observedRunningTime="2026-01-29 17:01:41.404637212 +0000 UTC m=+2384.313356484" watchObservedRunningTime="2026-01-29 17:01:41.412049257 +0000 UTC m=+2384.320768529" Jan 29 17:01:41 crc kubenswrapper[4886]: I0129 17:01:41.430883 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" podStartSLOduration=5.066415954 podStartE2EDuration="1m19.430867255s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:25.834925776 +0000 UTC m=+2308.743645048" lastFinishedPulling="2026-01-29 17:01:40.199377077 +0000 UTC m=+2383.108096349" observedRunningTime="2026-01-29 17:01:41.429012894 +0000 UTC m=+2384.337732166" watchObservedRunningTime="2026-01-29 17:01:41.430867255 +0000 UTC m=+2384.339586527" Jan 29 17:01:42 crc kubenswrapper[4886]: I0129 17:01:42.280213 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-2g2cz" Jan 29 17:01:42 crc kubenswrapper[4886]: I0129 17:01:42.284422 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:01:42 crc kubenswrapper[4886]: I0129 17:01:42.303620 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-w6qc6" Jan 29 17:01:42 crc kubenswrapper[4886]: I0129 17:01:42.339007 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" podStartSLOduration=68.553083419 podStartE2EDuration="1m20.338983487s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:01:28.409705235 +0000 UTC m=+2371.318424507" lastFinishedPulling="2026-01-29 17:01:40.195605303 +0000 UTC m=+2383.104324575" observedRunningTime="2026-01-29 17:01:42.324690833 +0000 UTC m=+2385.233410115" watchObservedRunningTime="2026-01-29 17:01:42.338983487 +0000 UTC m=+2385.247702759" Jan 29 17:01:42 crc kubenswrapper[4886]: I0129 17:01:42.478253 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-pfw9c" Jan 29 17:01:42 crc kubenswrapper[4886]: I0129 17:01:42.508272 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4mmm8" Jan 29 17:01:42 crc kubenswrapper[4886]: I0129 17:01:42.583041 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qf2xg" Jan 29 17:01:42 crc kubenswrapper[4886]: I0129 17:01:42.613892 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-77z62" Jan 29 17:01:42 crc kubenswrapper[4886]: E0129 17:01:42.620988 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" podUID="165231a4-c627-484b-9aab-b4ce3feafe7e" Jan 29 17:01:42 crc 
kubenswrapper[4886]: I0129 17:01:42.928100 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-xnccq" Jan 29 17:01:42 crc kubenswrapper[4886]: I0129 17:01:42.999073 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-cmfj2" Jan 29 17:01:43 crc kubenswrapper[4886]: I0129 17:01:43.294185 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" event={"ID":"70336809-8231-4ed9-a912-8b668aaa53bb","Type":"ContainerStarted","Data":"98ae179bdcc94a3c5aec25014bf612acbff88d1aaff9be2ab0f329e78cbb5105"} Jan 29 17:01:43 crc kubenswrapper[4886]: I0129 17:01:43.294381 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" Jan 29 17:01:43 crc kubenswrapper[4886]: I0129 17:01:43.297286 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" event={"ID":"67107e9f-cf09-4d35-af26-c77f4d76083a","Type":"ContainerStarted","Data":"d9fb8173587b39cf7aff6ed09fabb9e71bf83f66ea01fee608a23870907d7be6"} Jan 29 17:01:43 crc kubenswrapper[4886]: I0129 17:01:43.297785 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" Jan 29 17:01:43 crc kubenswrapper[4886]: I0129 17:01:43.316262 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" podStartSLOduration=3.026945847 podStartE2EDuration="1m21.316243583s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:24.112535892 +0000 UTC m=+2307.021255164" lastFinishedPulling="2026-01-29 17:01:42.401833628 +0000 UTC m=+2385.310552900" observedRunningTime="2026-01-29 17:01:43.308137089 +0000 UTC m=+2386.216856361" watchObservedRunningTime="2026-01-29 17:01:43.316243583 +0000 UTC m=+2386.224962855" Jan 29 17:01:43 crc kubenswrapper[4886]: I0129 17:01:43.330764 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" podStartSLOduration=5.661298901 podStartE2EDuration="1m21.330741622s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:25.839312729 +0000 UTC m=+2308.748032011" lastFinishedPulling="2026-01-29 17:01:41.50875546 +0000 UTC m=+2384.417474732" observedRunningTime="2026-01-29 17:01:43.323316237 +0000 UTC m=+2386.232035509" watchObservedRunningTime="2026-01-29 17:01:43.330741622 +0000 UTC m=+2386.239460904" Jan 29 17:01:49 crc kubenswrapper[4886]: I0129 17:01:49.349749 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" event={"ID":"d01e417c-a1b0-445d-83eb-f3c21a492138","Type":"ContainerStarted","Data":"e82b816aa22fa7cb8c8087a66fb3102fb0562fb86926c76b7385ee50136b1363"} Jan 29 17:01:49 crc kubenswrapper[4886]: I0129 17:01:49.350514 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" Jan 29 17:01:49 crc kubenswrapper[4886]: I0129 17:01:49.367279 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" podStartSLOduration=3.3532462 podStartE2EDuration="1m28.36725026s" podCreationTimestamp="2026-01-29 17:00:21 +0000 UTC" firstStartedPulling="2026-01-29 17:00:23.915815413 +0000 UTC m=+2306.824534685" lastFinishedPulling="2026-01-29 17:01:48.929819463 +0000 UTC m=+2391.838538745" observedRunningTime="2026-01-29 17:01:49.361744649 +0000 UTC m=+2392.270463931" watchObservedRunningTime="2026-01-29 17:01:49.36725026 +0000 UTC m=+2392.275969552" Jan 29 17:01:50 crc kubenswrapper[4886]: I0129 17:01:50.615237 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:01:50 crc kubenswrapper[4886]: E0129 17:01:50.615639 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:01:52 crc kubenswrapper[4886]: I0129 17:01:52.660097 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-kwr4n" Jan 29 17:01:52 crc kubenswrapper[4886]: I0129 17:01:52.678292 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-zpgq2" Jan 29 17:01:52 crc kubenswrapper[4886]: I0129 17:01:52.717105 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-c4j5s" Jan 29 17:01:52 crc kubenswrapper[4886]: I0129 17:01:52.751542 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-9zqmc" Jan 29 17:01:52 crc kubenswrapper[4886]: I0129 17:01:52.793977 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-dxcgn" Jan 29 17:01:52 crc kubenswrapper[4886]: I0129 17:01:52.830393 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8gq2g" Jan 29 17:01:53 crc kubenswrapper[4886]: I0129 17:01:53.310034 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-75495fd598-2hpj4" Jan 29 17:01:53 crc kubenswrapper[4886]: I0129 17:01:53.345406 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-hf95f" Jan 29 17:01:54 crc kubenswrapper[4886]: I0129 17:01:54.372937 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-t5n28" Jan 29 17:01:54 crc kubenswrapper[4886]: I0129 17:01:54.419682 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh" Jan 29 17:01:55 crc kubenswrapper[4886]: I0129 17:01:55.408258 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" event={"ID":"165231a4-c627-484b-9aab-b4ce3feafe7e","Type":"ContainerStarted","Data":"b52f785a280bd9a7fef88e5f2e155831a76530296552cee1aafe344c231a6f35"} Jan 29 17:01:55 crc kubenswrapper[4886]: I0129 17:01:55.439418 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ffdr9" podStartSLOduration=4.780321854 podStartE2EDuration="1m33.439398421s" podCreationTimestamp="2026-01-29 17:00:22 +0000 UTC" firstStartedPulling="2026-01-29 17:00:26.063449959 +0000 UTC m=+2308.972169231" lastFinishedPulling="2026-01-29 17:01:54.722526526 +0000 UTC m=+2397.631245798" observedRunningTime="2026-01-29 17:01:55.430084504 +0000 UTC m=+2398.338803816" watchObservedRunningTime="2026-01-29 17:01:55.439398421 +0000 UTC m=+2398.348117693" Jan 29 17:02:01 crc kubenswrapper[4886]: I0129 17:02:01.615414 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:02:01 crc kubenswrapper[4886]: E0129 17:02:01.616544 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:02:02 crc kubenswrapper[4886]: I0129 17:02:02.432258 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-rhxnz" Jan 29 17:02:15 crc kubenswrapper[4886]: I0129 17:02:15.615229 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:02:15 crc kubenswrapper[4886]: E0129 17:02:15.616082 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:02:20 crc kubenswrapper[4886]: I0129 17:02:20.942517 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pmcr7"] Jan 29 17:02:20 crc kubenswrapper[4886]: I0129 17:02:20.945692 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" Jan 29 17:02:20 crc kubenswrapper[4886]: I0129 17:02:20.948432 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 29 17:02:20 crc kubenswrapper[4886]: I0129 17:02:20.949357 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 29 17:02:20 crc kubenswrapper[4886]: I0129 17:02:20.949617 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 29 17:02:20 crc kubenswrapper[4886]: I0129 17:02:20.952916 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rvlpg" Jan 29 17:02:20 crc kubenswrapper[4886]: I0129 17:02:20.959060 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pmcr7"] Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.030408 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cgwx"] Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.032623 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.035604 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.039044 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cgwx"] Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.124787 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7jjt\" (UniqueName: \"kubernetes.io/projected/2f1c4419-6120-44b9-853c-7a42391db3e7-kube-api-access-q7jjt\") pod \"dnsmasq-dns-675f4bcbfc-pmcr7\" (UID: \"2f1c4419-6120-44b9-853c-7a42391db3e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.124846 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhfqx\" (UniqueName: \"kubernetes.io/projected/204a721b-36ee-4631-8358-f2511f332249-kube-api-access-lhfqx\") pod \"dnsmasq-dns-78dd6ddcc-4cgwx\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.124883 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1c4419-6120-44b9-853c-7a42391db3e7-config\") pod \"dnsmasq-dns-675f4bcbfc-pmcr7\" (UID: \"2f1c4419-6120-44b9-853c-7a42391db3e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.124980 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4cgwx\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.125005 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-config\") pod \"dnsmasq-dns-78dd6ddcc-4cgwx\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.226688 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4cgwx\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.226733 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-config\") pod \"dnsmasq-dns-78dd6ddcc-4cgwx\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.226818 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7jjt\" (UniqueName: \"kubernetes.io/projected/2f1c4419-6120-44b9-853c-7a42391db3e7-kube-api-access-q7jjt\") pod \"dnsmasq-dns-675f4bcbfc-pmcr7\" (UID: \"2f1c4419-6120-44b9-853c-7a42391db3e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.226840 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhfqx\" (UniqueName: \"kubernetes.io/projected/204a721b-36ee-4631-8358-f2511f332249-kube-api-access-lhfqx\") pod \"dnsmasq-dns-78dd6ddcc-4cgwx\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.226859 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1c4419-6120-44b9-853c-7a42391db3e7-config\") pod \"dnsmasq-dns-675f4bcbfc-pmcr7\" (UID: \"2f1c4419-6120-44b9-853c-7a42391db3e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.227769 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1c4419-6120-44b9-853c-7a42391db3e7-config\") pod \"dnsmasq-dns-675f4bcbfc-pmcr7\" (UID: \"2f1c4419-6120-44b9-853c-7a42391db3e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.227788 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-config\") pod \"dnsmasq-dns-78dd6ddcc-4cgwx\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.227819 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4cgwx\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.247739 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7jjt\" (UniqueName: \"kubernetes.io/projected/2f1c4419-6120-44b9-853c-7a42391db3e7-kube-api-access-q7jjt\") pod \"dnsmasq-dns-675f4bcbfc-pmcr7\" (UID: \"2f1c4419-6120-44b9-853c-7a42391db3e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.252147 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lhfqx\" (UniqueName: \"kubernetes.io/projected/204a721b-36ee-4631-8358-f2511f332249-kube-api-access-lhfqx\") pod \"dnsmasq-dns-78dd6ddcc-4cgwx\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.265940 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.354315 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.775052 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pmcr7"] Jan 29 17:02:21 crc kubenswrapper[4886]: W0129 17:02:21.785858 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f1c4419_6120_44b9_853c_7a42391db3e7.slice/crio-617c1fe920842500bf22662dbcff00fb4394c8a8a4577281f837a4ae20881073 WatchSource:0}: Error finding container 617c1fe920842500bf22662dbcff00fb4394c8a8a4577281f837a4ae20881073: Status 404 returned error can't find the container with id 617c1fe920842500bf22662dbcff00fb4394c8a8a4577281f837a4ae20881073 Jan 29 17:02:21 crc kubenswrapper[4886]: I0129 17:02:21.866371 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cgwx"] Jan 29 17:02:21 crc kubenswrapper[4886]: W0129 17:02:21.875096 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod204a721b_36ee_4631_8358_f2511f332249.slice/crio-b0ce5d271c3a87e35c87ccbefa1e0c1a96ac0ecd541d22ead6b84099a6bd1679 WatchSource:0}: Error finding container b0ce5d271c3a87e35c87ccbefa1e0c1a96ac0ecd541d22ead6b84099a6bd1679: Status 404 returned error can't find the container with id b0ce5d271c3a87e35c87ccbefa1e0c1a96ac0ecd541d22ead6b84099a6bd1679 Jan 29 17:02:22 crc kubenswrapper[4886]: I0129 17:02:22.664336 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" event={"ID":"2f1c4419-6120-44b9-853c-7a42391db3e7","Type":"ContainerStarted","Data":"617c1fe920842500bf22662dbcff00fb4394c8a8a4577281f837a4ae20881073"} Jan 29 17:02:22 crc kubenswrapper[4886]: I0129 17:02:22.666235 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" event={"ID":"204a721b-36ee-4631-8358-f2511f332249","Type":"ContainerStarted","Data":"b0ce5d271c3a87e35c87ccbefa1e0c1a96ac0ecd541d22ead6b84099a6bd1679"} Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.734053 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pmcr7"] Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.772730 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tn5pt"] Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.774709 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.785930 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tn5pt"] Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.888515 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tn5pt\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.888607 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-config\") pod \"dnsmasq-dns-666b6646f7-tn5pt\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.888680 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6zcd\" (UniqueName: \"kubernetes.io/projected/3748c627-3deb-4b89-acd3-2269f42ba343-kube-api-access-x6zcd\") pod \"dnsmasq-dns-666b6646f7-tn5pt\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.990786 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6zcd\" (UniqueName: \"kubernetes.io/projected/3748c627-3deb-4b89-acd3-2269f42ba343-kube-api-access-x6zcd\") pod \"dnsmasq-dns-666b6646f7-tn5pt\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.990973 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tn5pt\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.991042 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-config\") pod \"dnsmasq-dns-666b6646f7-tn5pt\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.992123 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-config\") pod \"dnsmasq-dns-666b6646f7-tn5pt\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:02:23 crc kubenswrapper[4886]: I0129 17:02:23.992271 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tn5pt\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.015070 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6zcd\" (UniqueName: 
\"kubernetes.io/projected/3748c627-3deb-4b89-acd3-2269f42ba343-kube-api-access-x6zcd\") pod \"dnsmasq-dns-666b6646f7-tn5pt\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.122317 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.143595 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cgwx"] Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.164281 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bqbqx"] Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.166021 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.180262 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bqbqx"] Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.310661 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb44s\" (UniqueName: \"kubernetes.io/projected/6508ccc6-d71f-449d-bbe1-83270d005815-kube-api-access-kb44s\") pod \"dnsmasq-dns-57d769cc4f-bqbqx\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.310940 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bqbqx\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.311221 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-config\") pod \"dnsmasq-dns-57d769cc4f-bqbqx\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.412512 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-config\") pod \"dnsmasq-dns-57d769cc4f-bqbqx\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.412625 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb44s\" (UniqueName: \"kubernetes.io/projected/6508ccc6-d71f-449d-bbe1-83270d005815-kube-api-access-kb44s\") pod \"dnsmasq-dns-57d769cc4f-bqbqx\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.412661 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bqbqx\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.413690 4886 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-config\") pod \"dnsmasq-dns-57d769cc4f-bqbqx\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.414085 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bqbqx\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.444066 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb44s\" (UniqueName: \"kubernetes.io/projected/6508ccc6-d71f-449d-bbe1-83270d005815-kube-api-access-kb44s\") pod \"dnsmasq-dns-57d769cc4f-bqbqx\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.569641 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.733010 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tn5pt"] Jan 29 17:02:24 crc kubenswrapper[4886]: W0129 17:02:24.741681 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3748c627_3deb_4b89_acd3_2269f42ba343.slice/crio-5ab6a774b30c4926836ad5d20a9d8ca3a61ba5556b7b5bbd72dc9a90a6ac1502 WatchSource:0}: Error finding container 5ab6a774b30c4926836ad5d20a9d8ca3a61ba5556b7b5bbd72dc9a90a6ac1502: Status 404 returned error can't find the container with id 5ab6a774b30c4926836ad5d20a9d8ca3a61ba5556b7b5bbd72dc9a90a6ac1502 Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.902704 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.904762 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.944077 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-wvnrk" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.945389 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.945947 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.946143 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.946618 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.959619 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.962732 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.981869 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 17:02:24 crc kubenswrapper[4886]: I0129 17:02:24.995462 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:24.998376 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.004704 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.008021 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.013836 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.023897 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.047779 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.047833 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b0be43b-8956-45aa-ad50-de9183b3fea3-config-data\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.047859 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.047972 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b0be43b-8956-45aa-ad50-de9183b3fea3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.048047 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b0be43b-8956-45aa-ad50-de9183b3fea3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.048081 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.048229 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b0be43b-8956-45aa-ad50-de9183b3fea3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.048276 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ae4636fd-e9b4-4ea8-ae5f-484166bf5cbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae4636fd-e9b4-4ea8-ae5f-484166bf5cbc\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.048499 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpbz9\" (UniqueName: \"kubernetes.io/projected/2b0be43b-8956-45aa-ad50-de9183b3fea3-kube-api-access-vpbz9\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.048606 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.048672 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b0be43b-8956-45aa-ad50-de9183b3fea3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.099182 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bqbqx"] Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.150909 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151213 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67qmm\" (UniqueName: \"kubernetes.io/projected/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-kube-api-access-67qmm\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151269 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/842bfe4d-04ba-4143-9076-3033163c7b82-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151299 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b0be43b-8956-45aa-ad50-de9183b3fea3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151342 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151363 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ae4636fd-e9b4-4ea8-ae5f-484166bf5cbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae4636fd-e9b4-4ea8-ae5f-484166bf5cbc\") pod \"rabbitmq-server-0\" (UID: 
\"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151382 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/842bfe4d-04ba-4143-9076-3033163c7b82-server-conf\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151397 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151434 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-config-data\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151456 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151478 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151515 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpbz9\" (UniqueName: \"kubernetes.io/projected/2b0be43b-8956-45aa-ad50-de9183b3fea3-kube-api-access-vpbz9\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151546 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-863286b1-f8a7-473e-bfad-effd8e0e46c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-863286b1-f8a7-473e-bfad-effd8e0e46c7\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151582 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151602 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b0be43b-8956-45aa-ad50-de9183b3fea3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " 
pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151620 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151653 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151684 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv64g\" (UniqueName: \"kubernetes.io/projected/842bfe4d-04ba-4143-9076-3033163c7b82-kube-api-access-hv64g\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151706 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151746 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151762 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/842bfe4d-04ba-4143-9076-3033163c7b82-config-data\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151778 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151811 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-server-conf\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151832 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b0be43b-8956-45aa-ad50-de9183b3fea3-config-data\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151847 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151863 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ffb99285-fad5-4b64-a7c1-8c79996a97a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ffb99285-fad5-4b64-a7c1-8c79996a97a0\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151920 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151943 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/842bfe4d-04ba-4143-9076-3033163c7b82-pod-info\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.151986 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b0be43b-8956-45aa-ad50-de9183b3fea3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.152010 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/842bfe4d-04ba-4143-9076-3033163c7b82-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.152031 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-pod-info\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.152077 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.152100 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b0be43b-8956-45aa-ad50-de9183b3fea3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.154128 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/2b0be43b-8956-45aa-ad50-de9183b3fea3-config-data\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.154158 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.154768 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b0be43b-8956-45aa-ad50-de9183b3fea3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.155203 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b0be43b-8956-45aa-ad50-de9183b3fea3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.155290 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.159250 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b0be43b-8956-45aa-ad50-de9183b3fea3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.159611 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b0be43b-8956-45aa-ad50-de9183b3fea3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.159629 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.160500 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ae4636fd-e9b4-4ea8-ae5f-484166bf5cbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae4636fd-e9b4-4ea8-ae5f-484166bf5cbc\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d213695ed3765abf3a041dd1be7937f5b64f87e22fac48d2c805fc17dc0e08a3/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.161190 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.161570 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b0be43b-8956-45aa-ad50-de9183b3fea3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.177266 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpbz9\" (UniqueName: \"kubernetes.io/projected/2b0be43b-8956-45aa-ad50-de9183b3fea3-kube-api-access-vpbz9\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.217511 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ae4636fd-e9b4-4ea8-ae5f-484166bf5cbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae4636fd-e9b4-4ea8-ae5f-484166bf5cbc\") pod \"rabbitmq-server-0\" (UID: \"2b0be43b-8956-45aa-ad50-de9183b3fea3\") " pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256009 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256064 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256088 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/842bfe4d-04ba-4143-9076-3033163c7b82-config-data\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256106 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-server-conf\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 
17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256130 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ffb99285-fad5-4b64-a7c1-8c79996a97a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ffb99285-fad5-4b64-a7c1-8c79996a97a0\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256150 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256167 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/842bfe4d-04ba-4143-9076-3033163c7b82-pod-info\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256190 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/842bfe4d-04ba-4143-9076-3033163c7b82-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256204 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-pod-info\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256227 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256259 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67qmm\" (UniqueName: \"kubernetes.io/projected/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-kube-api-access-67qmm\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256283 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/842bfe4d-04ba-4143-9076-3033163c7b82-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256308 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256411 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/842bfe4d-04ba-4143-9076-3033163c7b82-server-conf\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256429 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256453 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-config-data\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256475 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256498 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256535 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-863286b1-f8a7-473e-bfad-effd8e0e46c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-863286b1-f8a7-473e-bfad-effd8e0e46c7\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256557 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256576 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.256611 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv64g\" (UniqueName: \"kubernetes.io/projected/842bfe4d-04ba-4143-9076-3033163c7b82-kube-api-access-hv64g\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.257337 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 
17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.258460 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-server-conf\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.258810 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/842bfe4d-04ba-4143-9076-3033163c7b82-config-data\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.259147 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.259175 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.264815 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.265558 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/842bfe4d-04ba-4143-9076-3033163c7b82-server-conf\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.265935 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/842bfe4d-04ba-4143-9076-3033163c7b82-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.266506 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/842bfe4d-04ba-4143-9076-3033163c7b82-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.267473 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.268380 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
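
The repeated "attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice..." lines mean the CSI node plugin backing these volumes (kubevirt.io.hostpath-provisioner here) does not advertise the STAGE_UNSTAGE_VOLUME node capability, so the kubelet skips the NodeStageVolume step and goes straight to the per-pod publish (the "MountVolume.SetUp" entries). A hedged sketch of probing a node plugin's capabilities over its gRPC socket follows; the socket path is an assumption that varies per driver, and the CSI spec Go bindings are assumed to be available.

// nodecaps.go: ask a CSI node plugin whether it supports staging.
package main

import (
	"context"
	"flag"
	"fmt"
	"log"
	"time"

	csi "github.com/container-storage-interface/spec/lib/go/csi"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	// The socket location below is only a placeholder; adjust it for the driver in use.
	endpoint := flag.String("endpoint", "unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock",
		"CSI node plugin socket (assumed path)")
	flag.Parse()

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	conn, err := grpc.Dial(*endpoint, grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	resp, err := csi.NewNodeClient(conn).NodeGetCapabilities(ctx, &csi.NodeGetCapabilitiesRequest{})
	if err != nil {
		log.Fatal(err)
	}
	stage := false
	for _, c := range resp.GetCapabilities() {
		if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
			stage = true
		}
	}
	// "false" here is consistent with the kubelet's "Skipping MountDevice" messages above.
	fmt.Println("STAGE_UNSTAGE_VOLUME advertised:", stage)
}
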
Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.271916 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ffb99285-fad5-4b64-a7c1-8c79996a97a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ffb99285-fad5-4b64-a7c1-8c79996a97a0\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/326093cb55c704f9a2105b595679c793cb8447479f9731f8a7fd148174243d7a/globalmount\"" pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.269269 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.269569 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.270230 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-config-data\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.271644 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.269466 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/842bfe4d-04ba-4143-9076-3033163c7b82-pod-info\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.280122 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
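
The pvc-prefixed volumes in these entries (pvc-ae4636fd-..., pvc-863286b1-..., pvc-ffb99285-...) are the names of the bound PersistentVolumes, and the "device mount path" ending in /globalmount is the per-volume directory the kubelet reserves under /var/lib/kubelet/plugins/kubernetes.io/csi/<driver>/; the long hex component appears to be a hash derived from the volume identity. A sketch with the same client-go assumptions as above resolves a PVC to its PV and prints the CSI driver and volume handle that make up the "kubernetes.io/csi/<driver>^<handle>" UniqueName strings seen here; the PVC name used as a default is an assumption, since the log only shows PV names.

// resolvepvc.go: map a PVC to its bound PV and the CSI identifiers behind it.
package main

import (
	"context"
	"flag"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	kubeconfig := flag.String("kubeconfig", "", "path to kubeconfig")
	ns := flag.String("namespace", "openstack", "PVC namespace")
	claim := flag.String("pvc", "persistence-rabbitmq-server-0", "PVC name (assumed for illustration)")
	flag.Parse()

	cfg, err := clientcmd.BuildConfigFromFlags("", *kubeconfig)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	pvc, err := cs.CoreV1().PersistentVolumeClaims(*ns).Get(context.TODO(), *claim, metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	// Spec.VolumeName is the bound PV, e.g. pvc-ae4636fd-e9b4-4ea8-ae5f-484166bf5cbc in the log.
	pv, err := cs.CoreV1().PersistentVolumes().Get(context.TODO(), pvc.Spec.VolumeName, metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	if pv.Spec.CSI == nil {
		log.Fatalf("%s is not a CSI-provisioned volume", pv.Name)
	}
	// Driver and volume handle are the two pieces joined by "^" in the log's UniqueName.
	fmt.Printf("pv: %s\ndriver: %s\nhandle: %s\n", pv.Name, pv.Spec.CSI.Driver, pv.Spec.CSI.VolumeHandle)
}

For this hostpath provisioner the volume handle printed last appears to equal the PV name, which is why the same pvc-... string shows up on both sides of the caret in the log.
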
Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.280275 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-863286b1-f8a7-473e-bfad-effd8e0e46c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-863286b1-f8a7-473e-bfad-effd8e0e46c7\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b05e7df932d194e194076bc038f6db5e1e433307caecab672c694750eca73b77/globalmount\"" pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.281323 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-pod-info\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.281946 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.287068 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.291030 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.296085 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.298574 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.298647 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.298708 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.298871 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.299113 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pch54" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.299180 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.309441 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67qmm\" (UniqueName: \"kubernetes.io/projected/49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10-kube-api-access-67qmm\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.313383 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.313539 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/842bfe4d-04ba-4143-9076-3033163c7b82-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.328764 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv64g\" (UniqueName: \"kubernetes.io/projected/842bfe4d-04ba-4143-9076-3033163c7b82-kube-api-access-hv64g\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.336459 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.364509 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9d0db9ae-746b-419a-bc61-bf85645d2bff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.364593 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-12dabd5a-7f4d-4d12-a40b-12125ccd9878\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12dabd5a-7f4d-4d12-a40b-12125ccd9878\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.364616 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9d0db9ae-746b-419a-bc61-bf85645d2bff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.364661 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.364690 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.364742 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9d0db9ae-746b-419a-bc61-bf85645d2bff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.364759 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9d0db9ae-746b-419a-bc61-bf85645d2bff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.364780 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9d0db9ae-746b-419a-bc61-bf85645d2bff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.364816 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpbmf\" (UniqueName: \"kubernetes.io/projected/9d0db9ae-746b-419a-bc61-bf85645d2bff-kube-api-access-bpbmf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.364838 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.364878 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.391392 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ffb99285-fad5-4b64-a7c1-8c79996a97a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ffb99285-fad5-4b64-a7c1-8c79996a97a0\") pod \"rabbitmq-server-2\" (UID: \"842bfe4d-04ba-4143-9076-3033163c7b82\") " pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.432830 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-863286b1-f8a7-473e-bfad-effd8e0e46c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-863286b1-f8a7-473e-bfad-effd8e0e46c7\") pod \"rabbitmq-server-1\" (UID: \"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10\") " pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.442746 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.466568 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9d0db9ae-746b-419a-bc61-bf85645d2bff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.466653 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-12dabd5a-7f4d-4d12-a40b-12125ccd9878\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12dabd5a-7f4d-4d12-a40b-12125ccd9878\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.466679 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9d0db9ae-746b-419a-bc61-bf85645d2bff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.466713 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.466746 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.466789 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9d0db9ae-746b-419a-bc61-bf85645d2bff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.466812 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9d0db9ae-746b-419a-bc61-bf85645d2bff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.466837 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9d0db9ae-746b-419a-bc61-bf85645d2bff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.466870 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpbmf\" (UniqueName: \"kubernetes.io/projected/9d0db9ae-746b-419a-bc61-bf85645d2bff-kube-api-access-bpbmf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc 
kubenswrapper[4886]: I0129 17:02:25.466899 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.466935 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.467844 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.468158 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.471962 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9d0db9ae-746b-419a-bc61-bf85645d2bff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.472816 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9d0db9ae-746b-419a-bc61-bf85645d2bff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.473143 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9d0db9ae-746b-419a-bc61-bf85645d2bff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.481571 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9d0db9ae-746b-419a-bc61-bf85645d2bff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.484115 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.486233 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/9d0db9ae-746b-419a-bc61-bf85645d2bff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.489979 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9d0db9ae-746b-419a-bc61-bf85645d2bff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.498626 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpbmf\" (UniqueName: \"kubernetes.io/projected/9d0db9ae-746b-419a-bc61-bf85645d2bff-kube-api-access-bpbmf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.636742 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.666291 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.666341 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-12dabd5a-7f4d-4d12-a40b-12125ccd9878\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12dabd5a-7f4d-4d12-a40b-12125ccd9878\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bee345f7e070967b6cc29d6dbc72d8fe7f7c7012e7f3befd39c45a65d0513986/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.725569 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" event={"ID":"6508ccc6-d71f-449d-bbe1-83270d005815","Type":"ContainerStarted","Data":"3cb5dbf55000d2d62fd9df0707aa0b2ae3790c985165faca182a19e1e38e6908"} Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.755606 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-12dabd5a-7f4d-4d12-a40b-12125ccd9878\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12dabd5a-7f4d-4d12-a40b-12125ccd9878\") pod \"rabbitmq-cell1-server-0\" (UID: \"9d0db9ae-746b-419a-bc61-bf85645d2bff\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.768937 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" event={"ID":"3748c627-3deb-4b89-acd3-2269f42ba343","Type":"ContainerStarted","Data":"5ab6a774b30c4926836ad5d20a9d8ca3a61ba5556b7b5bbd72dc9a90a6ac1502"} Jan 29 17:02:25 crc kubenswrapper[4886]: I0129 17:02:25.965203 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.127768 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.211561 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.399065 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Jan 29 17:02:26 crc kubenswrapper[4886]: W0129 17:02:26.456049 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842bfe4d_04ba_4143_9076_3033163c7b82.slice/crio-a37b33399d781fa177e976ceeb1b5940ed29651715b90f0db3dbe52f088dc68f WatchSource:0}: Error finding container a37b33399d781fa177e976ceeb1b5940ed29651715b90f0db3dbe52f088dc68f: Status 404 returned error can't find the container with id a37b33399d781fa177e976ceeb1b5940ed29651715b90f0db3dbe52f088dc68f Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.477607 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.480580 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.490138 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.492829 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.493031 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-jhmnh" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.493148 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.494111 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.543704 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.614804 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:02:26 crc kubenswrapper[4886]: E0129 17:02:26.615089 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.623453 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98bed306-aa68-4e53-affc-e04497079ccb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.623507 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98bed306-aa68-4e53-affc-e04497079ccb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.623706 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c95b82a4-c681-4c74-b958-f29b26ce56ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c95b82a4-c681-4c74-b958-f29b26ce56ea\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.623896 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bed306-aa68-4e53-affc-e04497079ccb-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.623964 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98bed306-aa68-4e53-affc-e04497079ccb-kolla-config\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.624100 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98bed306-aa68-4e53-affc-e04497079ccb-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.624823 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2mz6\" (UniqueName: \"kubernetes.io/projected/98bed306-aa68-4e53-affc-e04497079ccb-kube-api-access-x2mz6\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.625113 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98bed306-aa68-4e53-affc-e04497079ccb-config-data-default\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.728537 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98bed306-aa68-4e53-affc-e04497079ccb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.729059 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c95b82a4-c681-4c74-b958-f29b26ce56ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c95b82a4-c681-4c74-b958-f29b26ce56ea\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 
17:02:26.729225 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98bed306-aa68-4e53-affc-e04497079ccb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.729244 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bed306-aa68-4e53-affc-e04497079ccb-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.730515 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98bed306-aa68-4e53-affc-e04497079ccb-kolla-config\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.730626 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98bed306-aa68-4e53-affc-e04497079ccb-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.730651 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2mz6\" (UniqueName: \"kubernetes.io/projected/98bed306-aa68-4e53-affc-e04497079ccb-kube-api-access-x2mz6\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.730869 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98bed306-aa68-4e53-affc-e04497079ccb-config-data-default\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.730940 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98bed306-aa68-4e53-affc-e04497079ccb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.733436 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98bed306-aa68-4e53-affc-e04497079ccb-config-data-default\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.733704 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98bed306-aa68-4e53-affc-e04497079ccb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.735640 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.737578 4886 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.737815 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c95b82a4-c681-4c74-b958-f29b26ce56ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c95b82a4-c681-4c74-b958-f29b26ce56ea\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3752126cfed518dabb57802d31fe1f9ab6a18ac412e8a3d2f0a6cf445251bd07/globalmount\"" pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.738998 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98bed306-aa68-4e53-affc-e04497079ccb-kolla-config\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.742124 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bed306-aa68-4e53-affc-e04497079ccb-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.746138 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98bed306-aa68-4e53-affc-e04497079ccb-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.751922 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2mz6\" (UniqueName: \"kubernetes.io/projected/98bed306-aa68-4e53-affc-e04497079ccb-kube-api-access-x2mz6\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.795085 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c95b82a4-c681-4c74-b958-f29b26ce56ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c95b82a4-c681-4c74-b958-f29b26ce56ea\") pod \"openstack-galera-0\" (UID: \"98bed306-aa68-4e53-affc-e04497079ccb\") " pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.822306 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.870536 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"842bfe4d-04ba-4143-9076-3033163c7b82","Type":"ContainerStarted","Data":"a37b33399d781fa177e976ceeb1b5940ed29651715b90f0db3dbe52f088dc68f"} Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.874648 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10","Type":"ContainerStarted","Data":"f8f1b5546a85023fcb8e48d8f18ea19083d41ec9d738804c59ee6271fe642723"} Jan 29 17:02:26 crc kubenswrapper[4886]: I0129 17:02:26.890890 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b0be43b-8956-45aa-ad50-de9183b3fea3","Type":"ContainerStarted","Data":"3b52df94d505c7f7b34cd527062caeb6a596ff835c3122d7c780516aec2c0f6d"} Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.458295 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 29 17:02:27 crc kubenswrapper[4886]: W0129 17:02:27.520485 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98bed306_aa68_4e53_affc_e04497079ccb.slice/crio-b56dd68fc17b84a407ba9baf75650d619ea6c98198893b53c62470f66159797d WatchSource:0}: Error finding container b56dd68fc17b84a407ba9baf75650d619ea6c98198893b53c62470f66159797d: Status 404 returned error can't find the container with id b56dd68fc17b84a407ba9baf75650d619ea6c98198893b53c62470f66159797d Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.818434 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.820022 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.829767 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-gp68l" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.830015 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.830131 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.830241 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.840930 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.925293 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9d0db9ae-746b-419a-bc61-bf85645d2bff","Type":"ContainerStarted","Data":"8e3c7aa1c69a329a7427b4ac8e75a6ba30bf1c14cd9bec54b7145d363fed3093"} Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.942307 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"98bed306-aa68-4e53-affc-e04497079ccb","Type":"ContainerStarted","Data":"b56dd68fc17b84a407ba9baf75650d619ea6c98198893b53c62470f66159797d"} Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.971501 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7khl\" (UniqueName: \"kubernetes.io/projected/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-kube-api-access-k7khl\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.971563 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0e4b33e1-211c-4727-b145-8a8e2e359423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e4b33e1-211c-4727-b145-8a8e2e359423\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.973879 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.973948 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.974015 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.974078 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.974343 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:27 crc kubenswrapper[4886]: I0129 17:02:27.974415 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.077093 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.077180 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.077227 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7khl\" (UniqueName: \"kubernetes.io/projected/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-kube-api-access-k7khl\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.077258 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0e4b33e1-211c-4727-b145-8a8e2e359423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e4b33e1-211c-4727-b145-8a8e2e359423\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.077318 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.077364 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.077409 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.077461 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.077674 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.078052 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.080201 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.081820 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.093309 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.093314 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.093419 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0e4b33e1-211c-4727-b145-8a8e2e359423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e4b33e1-211c-4727-b145-8a8e2e359423\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4ea550bbf1bf2f4ac54a6894dfc3a6d7f2959dcdb917de414b494340871d563d/globalmount\"" pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.093537 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.096233 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7khl\" (UniqueName: \"kubernetes.io/projected/954d7d1e-fd92-4c83-87d8-87a1f866dbbe-kube-api-access-k7khl\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.154906 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.156184 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.159841 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-m5568" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.159998 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.160084 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.162437 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0e4b33e1-211c-4727-b145-8a8e2e359423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e4b33e1-211c-4727-b145-8a8e2e359423\") pod \"openstack-cell1-galera-0\" (UID: \"954d7d1e-fd92-4c83-87d8-87a1f866dbbe\") " pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.194585 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.286481 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c8ef15-a2b1-41df-8048-752b56d26653-memcached-tls-certs\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.286662 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88c8ef15-a2b1-41df-8048-752b56d26653-config-data\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.286945 
4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c8ef15-a2b1-41df-8048-752b56d26653-combined-ca-bundle\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.287044 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/88c8ef15-a2b1-41df-8048-752b56d26653-kolla-config\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.287070 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vq5l\" (UniqueName: \"kubernetes.io/projected/88c8ef15-a2b1-41df-8048-752b56d26653-kube-api-access-4vq5l\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.388959 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/88c8ef15-a2b1-41df-8048-752b56d26653-kolla-config\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.389027 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vq5l\" (UniqueName: \"kubernetes.io/projected/88c8ef15-a2b1-41df-8048-752b56d26653-kube-api-access-4vq5l\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.389082 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c8ef15-a2b1-41df-8048-752b56d26653-memcached-tls-certs\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.389168 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88c8ef15-a2b1-41df-8048-752b56d26653-config-data\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.389352 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c8ef15-a2b1-41df-8048-752b56d26653-combined-ca-bundle\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.390554 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/88c8ef15-a2b1-41df-8048-752b56d26653-kolla-config\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.391306 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88c8ef15-a2b1-41df-8048-752b56d26653-config-data\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 
29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.394921 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c8ef15-a2b1-41df-8048-752b56d26653-combined-ca-bundle\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.416174 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vq5l\" (UniqueName: \"kubernetes.io/projected/88c8ef15-a2b1-41df-8048-752b56d26653-kube-api-access-4vq5l\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.418927 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c8ef15-a2b1-41df-8048-752b56d26653-memcached-tls-certs\") pod \"memcached-0\" (UID: \"88c8ef15-a2b1-41df-8048-752b56d26653\") " pod="openstack/memcached-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.458608 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 29 17:02:28 crc kubenswrapper[4886]: I0129 17:02:28.494849 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 29 17:02:29 crc kubenswrapper[4886]: I0129 17:02:29.091108 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 29 17:02:29 crc kubenswrapper[4886]: I0129 17:02:29.385461 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 17:02:30 crc kubenswrapper[4886]: I0129 17:02:30.032731 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"88c8ef15-a2b1-41df-8048-752b56d26653","Type":"ContainerStarted","Data":"1a197767c7bcdfe8876ec470e270c663a1a0267890c843f41fe09eab1488fbab"} Jan 29 17:02:30 crc kubenswrapper[4886]: I0129 17:02:30.035372 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"954d7d1e-fd92-4c83-87d8-87a1f866dbbe","Type":"ContainerStarted","Data":"5f8cdea6298d66d3f2be7ec07d09f99ef9e582a064f6a58e14fe6629079ba303"} Jan 29 17:02:30 crc kubenswrapper[4886]: I0129 17:02:30.698345 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 17:02:30 crc kubenswrapper[4886]: I0129 17:02:30.700239 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 17:02:30 crc kubenswrapper[4886]: I0129 17:02:30.704567 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-cn78q" Jan 29 17:02:30 crc kubenswrapper[4886]: I0129 17:02:30.731191 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 17:02:30 crc kubenswrapper[4886]: I0129 17:02:30.767297 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrp8r\" (UniqueName: \"kubernetes.io/projected/dba0c99a-0f14-42bd-8822-ee79fc73ee41-kube-api-access-xrp8r\") pod \"kube-state-metrics-0\" (UID: \"dba0c99a-0f14-42bd-8822-ee79fc73ee41\") " pod="openstack/kube-state-metrics-0" Jan 29 17:02:30 crc kubenswrapper[4886]: I0129 17:02:30.874040 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrp8r\" (UniqueName: \"kubernetes.io/projected/dba0c99a-0f14-42bd-8822-ee79fc73ee41-kube-api-access-xrp8r\") pod \"kube-state-metrics-0\" (UID: \"dba0c99a-0f14-42bd-8822-ee79fc73ee41\") " pod="openstack/kube-state-metrics-0" Jan 29 17:02:30 crc kubenswrapper[4886]: I0129 17:02:30.908489 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrp8r\" (UniqueName: \"kubernetes.io/projected/dba0c99a-0f14-42bd-8822-ee79fc73ee41-kube-api-access-xrp8r\") pod \"kube-state-metrics-0\" (UID: \"dba0c99a-0f14-42bd-8822-ee79fc73ee41\") " pod="openstack/kube-state-metrics-0" Jan 29 17:02:31 crc kubenswrapper[4886]: I0129 17:02:31.032972 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 17:02:31 crc kubenswrapper[4886]: I0129 17:02:31.561809 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c"] Jan 29 17:02:31 crc kubenswrapper[4886]: I0129 17:02:31.563550 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" Jan 29 17:02:31 crc kubenswrapper[4886]: I0129 17:02:31.575782 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Jan 29 17:02:31 crc kubenswrapper[4886]: I0129 17:02:31.576084 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-2xk9f" Jan 29 17:02:31 crc kubenswrapper[4886]: I0129 17:02:31.588895 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c"] Jan 29 17:02:31 crc kubenswrapper[4886]: I0129 17:02:31.698926 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1da890-a690-46b4-95aa-3f282b3cdc30-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-ld46c\" (UID: \"ee1da890-a690-46b4-95aa-3f282b3cdc30\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" Jan 29 17:02:31 crc kubenswrapper[4886]: I0129 17:02:31.698997 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmcn5\" (UniqueName: \"kubernetes.io/projected/ee1da890-a690-46b4-95aa-3f282b3cdc30-kube-api-access-bmcn5\") pod \"observability-ui-dashboards-66cbf594b5-ld46c\" (UID: \"ee1da890-a690-46b4-95aa-3f282b3cdc30\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" Jan 29 17:02:31 crc kubenswrapper[4886]: I0129 17:02:31.800634 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1da890-a690-46b4-95aa-3f282b3cdc30-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-ld46c\" (UID: \"ee1da890-a690-46b4-95aa-3f282b3cdc30\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" Jan 29 17:02:31 crc kubenswrapper[4886]: I0129 17:02:31.800723 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmcn5\" (UniqueName: \"kubernetes.io/projected/ee1da890-a690-46b4-95aa-3f282b3cdc30-kube-api-access-bmcn5\") pod \"observability-ui-dashboards-66cbf594b5-ld46c\" (UID: \"ee1da890-a690-46b4-95aa-3f282b3cdc30\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" Jan 29 17:02:31 crc kubenswrapper[4886]: E0129 17:02:31.800818 4886 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Jan 29 17:02:31 crc kubenswrapper[4886]: E0129 17:02:31.800903 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee1da890-a690-46b4-95aa-3f282b3cdc30-serving-cert podName:ee1da890-a690-46b4-95aa-3f282b3cdc30 nodeName:}" failed. No retries permitted until 2026-01-29 17:02:32.300885685 +0000 UTC m=+2435.209604957 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ee1da890-a690-46b4-95aa-3f282b3cdc30-serving-cert") pod "observability-ui-dashboards-66cbf594b5-ld46c" (UID: "ee1da890-a690-46b4-95aa-3f282b3cdc30") : secret "observability-ui-dashboards" not found Jan 29 17:02:31 crc kubenswrapper[4886]: I0129 17:02:31.846471 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmcn5\" (UniqueName: \"kubernetes.io/projected/ee1da890-a690-46b4-95aa-3f282b3cdc30-kube-api-access-bmcn5\") pod \"observability-ui-dashboards-66cbf594b5-ld46c\" (UID: \"ee1da890-a690-46b4-95aa-3f282b3cdc30\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.012184 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-69c97cc7f-npplt"] Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.030740 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.059375 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69c97cc7f-npplt"] Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.091536 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.113453 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-trusted-ca-bundle\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.113499 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-oauth-serving-cert\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.113539 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57bcd464-9c19-451c-b1e7-ec31c75da5dd-console-oauth-config\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.113678 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w979x\" (UniqueName: \"kubernetes.io/projected/57bcd464-9c19-451c-b1e7-ec31c75da5dd-kube-api-access-w979x\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.113737 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-console-config\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.113764 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-service-ca\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.113827 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57bcd464-9c19-451c-b1e7-ec31c75da5dd-console-serving-cert\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.128801 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.145757 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.145949 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.146090 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.146206 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.146709 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.147463 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.147566 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.147657 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.150528 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-gbmnx" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217510 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce7955a1-eb58-425a-872a-7ec102b8e090-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217567 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w979x\" (UniqueName: \"kubernetes.io/projected/57bcd464-9c19-451c-b1e7-ec31c75da5dd-kube-api-access-w979x\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217606 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-console-config\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217634 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-service-ca\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217679 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57bcd464-9c19-451c-b1e7-ec31c75da5dd-console-serving-cert\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217702 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217741 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217764 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217796 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217820 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217852 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-68e86941-9560-4703-a0e6-50bee25f62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " 
pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217919 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-trusted-ca-bundle\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217937 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-oauth-serving-cert\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217962 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2cnt\" (UniqueName: \"kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-kube-api-access-w2cnt\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217980 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57bcd464-9c19-451c-b1e7-ec31c75da5dd-console-oauth-config\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.217998 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.218027 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.219276 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-console-config\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.219683 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-service-ca\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.220178 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-trusted-ca-bundle\") pod \"console-69c97cc7f-npplt\" (UID: 
\"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.220761 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57bcd464-9c19-451c-b1e7-ec31c75da5dd-oauth-serving-cert\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.230429 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57bcd464-9c19-451c-b1e7-ec31c75da5dd-console-serving-cert\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.247599 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57bcd464-9c19-451c-b1e7-ec31c75da5dd-console-oauth-config\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.251038 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w979x\" (UniqueName: \"kubernetes.io/projected/57bcd464-9c19-451c-b1e7-ec31c75da5dd-kube-api-access-w979x\") pod \"console-69c97cc7f-npplt\" (UID: \"57bcd464-9c19-451c-b1e7-ec31c75da5dd\") " pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.319366 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.319420 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.319451 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-68e86941-9560-4703-a0e6-50bee25f62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.319492 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1da890-a690-46b4-95aa-3f282b3cdc30-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-ld46c\" (UID: \"ee1da890-a690-46b4-95aa-3f282b3cdc30\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.319540 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2cnt\" (UniqueName: 
\"kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-kube-api-access-w2cnt\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.319558 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.319584 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.319614 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce7955a1-eb58-425a-872a-7ec102b8e090-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.319682 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.319711 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.319732 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.320508 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.324755 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.324824 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-68e86941-9560-4703-a0e6-50bee25f62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5b5b0b1c62be5d324bfe10f676e08a70a611b72b2c99a9227275ea9ec17aa7e0/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.326583 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce7955a1-eb58-425a-872a-7ec102b8e090-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.327641 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.327891 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.331459 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1da890-a690-46b4-95aa-3f282b3cdc30-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-ld46c\" (UID: \"ee1da890-a690-46b4-95aa-3f282b3cdc30\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.331698 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.334169 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.336021 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.337903 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.348409 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2cnt\" (UniqueName: \"kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-kube-api-access-w2cnt\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.383020 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.383544 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-68e86941-9560-4703-a0e6-50bee25f62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0\") pod \"prometheus-metric-storage-0\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.462063 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 17:02:32 crc kubenswrapper[4886]: I0129 17:02:32.511481 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.369820 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-b7d9p"] Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.371217 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.375427 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.380349 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.380590 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-xd2tq" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.390211 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b7d9p"] Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.445566 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-xhds2"] Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.453082 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.462200 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xhds2"] Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.463479 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/544b4515-481c-47f1-acb6-ed332a3497d4-combined-ca-bundle\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.463538 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7wkw\" (UniqueName: \"kubernetes.io/projected/544b4515-481c-47f1-acb6-ed332a3497d4-kube-api-access-p7wkw\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.463569 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/544b4515-481c-47f1-acb6-ed332a3497d4-var-log-ovn\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.463597 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/544b4515-481c-47f1-acb6-ed332a3497d4-ovn-controller-tls-certs\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.463642 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/544b4515-481c-47f1-acb6-ed332a3497d4-var-run-ovn\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.463701 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/544b4515-481c-47f1-acb6-ed332a3497d4-scripts\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.463758 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/544b4515-481c-47f1-acb6-ed332a3497d4-var-run\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565471 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/544b4515-481c-47f1-acb6-ed332a3497d4-scripts\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565534 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-etc-ovs\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565568 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdcb4\" (UniqueName: \"kubernetes.io/projected/03dc141f-69cc-4cb4-af0b-acf85642b86e-kube-api-access-rdcb4\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565625 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/544b4515-481c-47f1-acb6-ed332a3497d4-var-run\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565641 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03dc141f-69cc-4cb4-af0b-acf85642b86e-scripts\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565678 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-var-log\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565713 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-var-run\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565754 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/544b4515-481c-47f1-acb6-ed332a3497d4-combined-ca-bundle\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565777 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7wkw\" (UniqueName: \"kubernetes.io/projected/544b4515-481c-47f1-acb6-ed332a3497d4-kube-api-access-p7wkw\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565795 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/544b4515-481c-47f1-acb6-ed332a3497d4-var-log-ovn\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565811 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/544b4515-481c-47f1-acb6-ed332a3497d4-ovn-controller-tls-certs\") pod 
\"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565836 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/544b4515-481c-47f1-acb6-ed332a3497d4-var-run-ovn\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.565878 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-var-lib\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.593478 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/544b4515-481c-47f1-acb6-ed332a3497d4-combined-ca-bundle\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.598895 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/544b4515-481c-47f1-acb6-ed332a3497d4-ovn-controller-tls-certs\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.624864 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7wkw\" (UniqueName: \"kubernetes.io/projected/544b4515-481c-47f1-acb6-ed332a3497d4-kube-api-access-p7wkw\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.671403 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-var-lib\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.671465 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-etc-ovs\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.671508 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdcb4\" (UniqueName: \"kubernetes.io/projected/03dc141f-69cc-4cb4-af0b-acf85642b86e-kube-api-access-rdcb4\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.671563 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03dc141f-69cc-4cb4-af0b-acf85642b86e-scripts\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.671597 
4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-var-log\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.671617 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-var-run\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.673284 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-etc-ovs\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.674245 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03dc141f-69cc-4cb4-af0b-acf85642b86e-scripts\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.706713 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdcb4\" (UniqueName: \"kubernetes.io/projected/03dc141f-69cc-4cb4-af0b-acf85642b86e-kube-api-access-rdcb4\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.957170 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/544b4515-481c-47f1-acb6-ed332a3497d4-scripts\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.957872 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/544b4515-481c-47f1-acb6-ed332a3497d4-var-run\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.958020 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/544b4515-481c-47f1-acb6-ed332a3497d4-var-log-ovn\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.958198 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/544b4515-481c-47f1-acb6-ed332a3497d4-var-run-ovn\") pod \"ovn-controller-b7d9p\" (UID: \"544b4515-481c-47f1-acb6-ed332a3497d4\") " pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.958535 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-var-run\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" 
Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.958613 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-var-lib\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.958702 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/03dc141f-69cc-4cb4-af0b-acf85642b86e-var-log\") pod \"ovn-controller-ovs-xhds2\" (UID: \"03dc141f-69cc-4cb4-af0b-acf85642b86e\") " pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:33 crc kubenswrapper[4886]: I0129 17:02:33.994851 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b7d9p" Jan 29 17:02:34 crc kubenswrapper[4886]: I0129 17:02:34.103105 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.256454 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.258696 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.261261 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-zjp5g" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.267235 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.267429 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.267631 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.268126 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.274946 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.356636 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cfc2829b-4c70-4482-9f64-05fedd0caae9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfc2829b-4c70-4482-9f64-05fedd0caae9\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.356748 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39601bb5-f2bc-47a6-824a-609c207b963f-config\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.356827 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xhx9\" (UniqueName: \"kubernetes.io/projected/39601bb5-f2bc-47a6-824a-609c207b963f-kube-api-access-5xhx9\") pod \"ovsdbserver-nb-0\" (UID: 
\"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.356890 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39601bb5-f2bc-47a6-824a-609c207b963f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.357035 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39601bb5-f2bc-47a6-824a-609c207b963f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.357067 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/39601bb5-f2bc-47a6-824a-609c207b963f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.357120 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/39601bb5-f2bc-47a6-824a-609c207b963f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.357225 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/39601bb5-f2bc-47a6-824a-609c207b963f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.459238 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39601bb5-f2bc-47a6-824a-609c207b963f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.459304 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/39601bb5-f2bc-47a6-824a-609c207b963f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.459358 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/39601bb5-f2bc-47a6-824a-609c207b963f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.459436 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/39601bb5-f2bc-47a6-824a-609c207b963f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: 
I0129 17:02:37.459526 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cfc2829b-4c70-4482-9f64-05fedd0caae9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfc2829b-4c70-4482-9f64-05fedd0caae9\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.459582 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39601bb5-f2bc-47a6-824a-609c207b963f-config\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.459622 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xhx9\" (UniqueName: \"kubernetes.io/projected/39601bb5-f2bc-47a6-824a-609c207b963f-kube-api-access-5xhx9\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.459654 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39601bb5-f2bc-47a6-824a-609c207b963f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.460544 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/39601bb5-f2bc-47a6-824a-609c207b963f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.461070 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39601bb5-f2bc-47a6-824a-609c207b963f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.461360 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39601bb5-f2bc-47a6-824a-609c207b963f-config\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.463076 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.463114 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cfc2829b-4c70-4482-9f64-05fedd0caae9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfc2829b-4c70-4482-9f64-05fedd0caae9\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7cf9dd5b9a6bbdfffd591daeb645dac0dc01e8f7deb302127ed56fc967835337/globalmount\"" pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.464589 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/39601bb5-f2bc-47a6-824a-609c207b963f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.464921 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39601bb5-f2bc-47a6-824a-609c207b963f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.465557 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/39601bb5-f2bc-47a6-824a-609c207b963f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.482908 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xhx9\" (UniqueName: \"kubernetes.io/projected/39601bb5-f2bc-47a6-824a-609c207b963f-kube-api-access-5xhx9\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.495742 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.497680 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.500230 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.500511 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.500806 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.500901 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-q9lrf" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.522937 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cfc2829b-4c70-4482-9f64-05fedd0caae9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfc2829b-4c70-4482-9f64-05fedd0caae9\") pod \"ovsdbserver-nb-0\" (UID: \"39601bb5-f2bc-47a6-824a-609c207b963f\") " pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.569055 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b015d0c-8672-450a-a079-965cc4ccd07f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.569462 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b015d0c-8672-450a-a079-965cc4ccd07f-config\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.569636 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b015d0c-8672-450a-a079-965cc4ccd07f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.569767 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9b6c9a9a-cc72-46ff-b530-2325a25d9ef0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b6c9a9a-cc72-46ff-b530-2325a25d9ef0\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.569963 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b015d0c-8672-450a-a079-965cc4ccd07f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.570215 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b015d0c-8672-450a-a079-965cc4ccd07f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.570351 
4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlzlz\" (UniqueName: \"kubernetes.io/projected/7b015d0c-8672-450a-a079-965cc4ccd07f-kube-api-access-vlzlz\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.570441 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b015d0c-8672-450a-a079-965cc4ccd07f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.572120 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.586784 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.672277 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b015d0c-8672-450a-a079-965cc4ccd07f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.672788 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b015d0c-8672-450a-a079-965cc4ccd07f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.672857 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlzlz\" (UniqueName: \"kubernetes.io/projected/7b015d0c-8672-450a-a079-965cc4ccd07f-kube-api-access-vlzlz\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.672940 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b015d0c-8672-450a-a079-965cc4ccd07f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.673003 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b015d0c-8672-450a-a079-965cc4ccd07f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.673118 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b015d0c-8672-450a-a079-965cc4ccd07f-config\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.673263 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b015d0c-8672-450a-a079-965cc4ccd07f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.673297 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9b6c9a9a-cc72-46ff-b530-2325a25d9ef0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b6c9a9a-cc72-46ff-b530-2325a25d9ef0\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.674172 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b015d0c-8672-450a-a079-965cc4ccd07f-config\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.674791 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b015d0c-8672-450a-a079-965cc4ccd07f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.676123 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b015d0c-8672-450a-a079-965cc4ccd07f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.680304 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.680379 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9b6c9a9a-cc72-46ff-b530-2325a25d9ef0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b6c9a9a-cc72-46ff-b530-2325a25d9ef0\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/297149745560dc1f1ff1e411a84efac3cc898ea24d98a3f7b5a3d7276b7eb1e8/globalmount\"" pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.680577 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b015d0c-8672-450a-a079-965cc4ccd07f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.680732 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b015d0c-8672-450a-a079-965cc4ccd07f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.693740 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b015d0c-8672-450a-a079-965cc4ccd07f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.709574 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vlzlz\" (UniqueName: \"kubernetes.io/projected/7b015d0c-8672-450a-a079-965cc4ccd07f-kube-api-access-vlzlz\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.724494 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9b6c9a9a-cc72-46ff-b530-2325a25d9ef0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b6c9a9a-cc72-46ff-b530-2325a25d9ef0\") pod \"ovsdbserver-sb-0\" (UID: \"7b015d0c-8672-450a-a079-965cc4ccd07f\") " pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:37 crc kubenswrapper[4886]: I0129 17:02:37.887428 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 29 17:02:41 crc kubenswrapper[4886]: I0129 17:02:41.617241 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:02:41 crc kubenswrapper[4886]: E0129 17:02:41.618246 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:02:56 crc kubenswrapper[4886]: I0129 17:02:56.615520 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:02:56 crc kubenswrapper[4886]: E0129 17:02:56.616711 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:03:11 crc kubenswrapper[4886]: I0129 17:03:11.615993 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:03:11 crc kubenswrapper[4886]: E0129 17:03:11.617660 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:03:14 crc kubenswrapper[4886]: E0129 17:03:14.655447 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 29 17:03:14 crc kubenswrapper[4886]: E0129 17:03:14.655999 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo 
'[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hv64g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-2_openstack(842bfe4d-04ba-4143-9076-3033163c7b82): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:03:14 crc kubenswrapper[4886]: E0129 17:03:14.657234 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-2" podUID="842bfe4d-04ba-4143-9076-3033163c7b82" Jan 29 17:03:14 crc kubenswrapper[4886]: E0129 17:03:14.676562 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 29 17:03:14 crc kubenswrapper[4886]: E0129 17:03:14.676760 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' 
> /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-67qmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-1_openstack(49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:03:14 crc kubenswrapper[4886]: E0129 17:03:14.678003 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-1" podUID="49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10" Jan 29 17:03:15 crc kubenswrapper[4886]: E0129 17:03:15.555559 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-1" podUID="49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10" Jan 29 17:03:15 crc kubenswrapper[4886]: E0129 17:03:15.555634 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-2" 
podUID="842bfe4d-04ba-4143-9076-3033163c7b82" Jan 29 17:03:20 crc kubenswrapper[4886]: E0129 17:03:20.489100 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 29 17:03:20 crc kubenswrapper[4886]: E0129 17:03:20.490290 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpbz9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(2b0be43b-8956-45aa-ad50-de9183b3fea3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:03:20 crc kubenswrapper[4886]: E0129 17:03:20.492062 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" 
podUID="2b0be43b-8956-45aa-ad50-de9183b3fea3" Jan 29 17:03:20 crc kubenswrapper[4886]: E0129 17:03:20.600916 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="2b0be43b-8956-45aa-ad50-de9183b3fea3" Jan 29 17:03:23 crc kubenswrapper[4886]: I0129 17:03:23.614798 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:03:23 crc kubenswrapper[4886]: E0129 17:03:23.615517 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:03:25 crc kubenswrapper[4886]: E0129 17:03:25.291967 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 29 17:03:25 crc kubenswrapper[4886]: E0129 17:03:25.292245 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lhfqx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4cgwx_openstack(204a721b-36ee-4631-8358-f2511f332249): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:03:25 crc kubenswrapper[4886]: E0129 17:03:25.294185 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" podUID="204a721b-36ee-4631-8358-f2511f332249" Jan 29 17:03:26 crc kubenswrapper[4886]: E0129 17:03:26.647371 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 29 17:03:26 crc kubenswrapper[4886]: E0129 17:03:26.647898 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bpbmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(9d0db9ae-746b-419a-bc61-bf85645d2bff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:03:26 crc kubenswrapper[4886]: E0129 17:03:26.649785 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="9d0db9ae-746b-419a-bc61-bf85645d2bff" Jan 29 17:03:26 crc kubenswrapper[4886]: E0129 17:03:26.677496 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="9d0db9ae-746b-419a-bc61-bf85645d2bff" Jan 29 17:03:30 crc kubenswrapper[4886]: E0129 17:03:30.083913 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 29 17:03:30 crc kubenswrapper[4886]: E0129 17:03:30.084624 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q7jjt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-pmcr7_openstack(2f1c4419-6120-44b9-853c-7a42391db3e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:03:30 crc kubenswrapper[4886]: E0129 17:03:30.085842 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" podUID="2f1c4419-6120-44b9-853c-7a42391db3e7" Jan 29 17:03:31 crc kubenswrapper[4886]: E0129 17:03:31.691991 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 29 17:03:31 crc kubenswrapper[4886]: E0129 17:03:31.692937 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6zcd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-tn5pt_openstack(3748c627-3deb-4b89-acd3-2269f42ba343): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:03:31 crc kubenswrapper[4886]: E0129 17:03:31.694122 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" podUID="3748c627-3deb-4b89-acd3-2269f42ba343" Jan 29 17:03:31 crc kubenswrapper[4886]: E0129 17:03:31.715570 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" podUID="3748c627-3deb-4b89-acd3-2269f42ba343" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.508175 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.519093 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.615897 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-dns-svc\") pod \"204a721b-36ee-4631-8358-f2511f332249\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.615988 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1c4419-6120-44b9-853c-7a42391db3e7-config\") pod \"2f1c4419-6120-44b9-853c-7a42391db3e7\" (UID: \"2f1c4419-6120-44b9-853c-7a42391db3e7\") " Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.616151 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhfqx\" (UniqueName: \"kubernetes.io/projected/204a721b-36ee-4631-8358-f2511f332249-kube-api-access-lhfqx\") pod \"204a721b-36ee-4631-8358-f2511f332249\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.616248 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7jjt\" (UniqueName: \"kubernetes.io/projected/2f1c4419-6120-44b9-853c-7a42391db3e7-kube-api-access-q7jjt\") pod \"2f1c4419-6120-44b9-853c-7a42391db3e7\" (UID: \"2f1c4419-6120-44b9-853c-7a42391db3e7\") " Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.616273 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-config\") pod \"204a721b-36ee-4631-8358-f2511f332249\" (UID: \"204a721b-36ee-4631-8358-f2511f332249\") " Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.621879 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-config" (OuterVolumeSpecName: "config") pod "204a721b-36ee-4631-8358-f2511f332249" (UID: "204a721b-36ee-4631-8358-f2511f332249"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.622235 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "204a721b-36ee-4631-8358-f2511f332249" (UID: "204a721b-36ee-4631-8358-f2511f332249"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.622544 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1c4419-6120-44b9-853c-7a42391db3e7-config" (OuterVolumeSpecName: "config") pod "2f1c4419-6120-44b9-853c-7a42391db3e7" (UID: "2f1c4419-6120-44b9-853c-7a42391db3e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.629290 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1c4419-6120-44b9-853c-7a42391db3e7-kube-api-access-q7jjt" (OuterVolumeSpecName: "kube-api-access-q7jjt") pod "2f1c4419-6120-44b9-853c-7a42391db3e7" (UID: "2f1c4419-6120-44b9-853c-7a42391db3e7"). InnerVolumeSpecName "kube-api-access-q7jjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.637115 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/204a721b-36ee-4631-8358-f2511f332249-kube-api-access-lhfqx" (OuterVolumeSpecName: "kube-api-access-lhfqx") pod "204a721b-36ee-4631-8358-f2511f332249" (UID: "204a721b-36ee-4631-8358-f2511f332249"). InnerVolumeSpecName "kube-api-access-lhfqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.718307 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7jjt\" (UniqueName: \"kubernetes.io/projected/2f1c4419-6120-44b9-853c-7a42391db3e7-kube-api-access-q7jjt\") on node \"crc\" DevicePath \"\"" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.718359 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.718371 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204a721b-36ee-4631-8358-f2511f332249-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.718382 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1c4419-6120-44b9-853c-7a42391db3e7-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.718394 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhfqx\" (UniqueName: \"kubernetes.io/projected/204a721b-36ee-4631-8358-f2511f332249-kube-api-access-lhfqx\") on node \"crc\" DevicePath \"\"" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.741096 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" event={"ID":"204a721b-36ee-4631-8358-f2511f332249","Type":"ContainerDied","Data":"b0ce5d271c3a87e35c87ccbefa1e0c1a96ac0ecd541d22ead6b84099a6bd1679"} Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.741141 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4cgwx" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.742675 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" event={"ID":"2f1c4419-6120-44b9-853c-7a42391db3e7","Type":"ContainerDied","Data":"617c1fe920842500bf22662dbcff00fb4394c8a8a4577281f837a4ae20881073"} Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.742726 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pmcr7" Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.821162 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cgwx"] Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.840871 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cgwx"] Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.865476 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pmcr7"] Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.873135 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pmcr7"] Jan 29 17:03:33 crc kubenswrapper[4886]: I0129 17:03:33.998146 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c"] Jan 29 17:03:34 crc kubenswrapper[4886]: E0129 17:03:34.085702 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 29 17:03:34 crc kubenswrapper[4886]: E0129 17:03:34.086276 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kb44s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-bqbqx_openstack(6508ccc6-d71f-449d-bbe1-83270d005815): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Jan 29 17:03:34 crc kubenswrapper[4886]: E0129 17:03:34.087543 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" podUID="6508ccc6-d71f-449d-bbe1-83270d005815" Jan 29 17:03:34 crc kubenswrapper[4886]: I0129 17:03:34.201232 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 17:03:34 crc kubenswrapper[4886]: I0129 17:03:34.230964 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b7d9p"] Jan 29 17:03:34 crc kubenswrapper[4886]: I0129 17:03:34.243933 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69c97cc7f-npplt"] Jan 29 17:03:34 crc kubenswrapper[4886]: I0129 17:03:34.367006 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 17:03:34 crc kubenswrapper[4886]: I0129 17:03:34.625469 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="204a721b-36ee-4631-8358-f2511f332249" path="/var/lib/kubelet/pods/204a721b-36ee-4631-8358-f2511f332249/volumes" Jan 29 17:03:34 crc kubenswrapper[4886]: I0129 17:03:34.626272 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1c4419-6120-44b9-853c-7a42391db3e7" path="/var/lib/kubelet/pods/2f1c4419-6120-44b9-853c-7a42391db3e7/volumes" Jan 29 17:03:34 crc kubenswrapper[4886]: I0129 17:03:34.751612 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69c97cc7f-npplt" event={"ID":"57bcd464-9c19-451c-b1e7-ec31c75da5dd","Type":"ContainerStarted","Data":"674a4a84e7a661a8a9f9dcf78ec6c308fb06c693f936096b0c80bdfde2f814ca"} Jan 29 17:03:34 crc kubenswrapper[4886]: I0129 17:03:34.753135 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b7d9p" event={"ID":"544b4515-481c-47f1-acb6-ed332a3497d4","Type":"ContainerStarted","Data":"f570984e5e7ce5895c501c3a0b3df5c2874fac80c1bf029801391a0fe3f26640"} Jan 29 17:03:34 crc kubenswrapper[4886]: I0129 17:03:34.754072 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce7955a1-eb58-425a-872a-7ec102b8e090","Type":"ContainerStarted","Data":"38705f04f0f2e20b7f5d72009f437278994e72d7c6d255707ef36ddaf6f80953"} Jan 29 17:03:34 crc kubenswrapper[4886]: I0129 17:03:34.755111 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" event={"ID":"ee1da890-a690-46b4-95aa-3f282b3cdc30","Type":"ContainerStarted","Data":"5fcc926e1a39bebeb290fc957f217493a2334ebaf02787d1068fc6d4a8c4f42a"} Jan 29 17:03:34 crc kubenswrapper[4886]: I0129 17:03:34.756753 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dba0c99a-0f14-42bd-8822-ee79fc73ee41","Type":"ContainerStarted","Data":"e23683912c13c24ac6376c0e92dd23177282cc9bf4441644e7ddbf8a433b486b"} Jan 29 17:03:35 crc kubenswrapper[4886]: I0129 17:03:35.615262 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:03:35 crc kubenswrapper[4886]: E0129 17:03:35.615739 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:03:35 crc kubenswrapper[4886]: I0129 17:03:35.767154 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69c97cc7f-npplt" event={"ID":"57bcd464-9c19-451c-b1e7-ec31c75da5dd","Type":"ContainerStarted","Data":"201fa2a5b2a106ae890063199356cfaf006a51f40787274a6ba75e8d67e88aaa"} Jan 29 17:03:36 crc kubenswrapper[4886]: I0129 17:03:36.813151 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69c97cc7f-npplt" podStartSLOduration=65.813119711 podStartE2EDuration="1m5.813119711s" podCreationTimestamp="2026-01-29 17:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:03:36.7956357 +0000 UTC m=+2499.704355022" watchObservedRunningTime="2026-01-29 17:03:36.813119711 +0000 UTC m=+2499.721839023" Jan 29 17:03:37 crc kubenswrapper[4886]: E0129 17:03:37.945468 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Jan 29 17:03:37 crc kubenswrapper[4886]: E0129 17:03:37.945747 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n574h5h568h87h5bch65fh68fh74h644h546h64bh68ch9bh79h54ch6ch5b5h69hd9h684hf7h649h68dh54bh66ch656h5fh78h5b9h549hd9h4q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4vq5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHan
dler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(88c8ef15-a2b1-41df-8048-752b56d26653): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:03:37 crc kubenswrapper[4886]: E0129 17:03:37.947213 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="88c8ef15-a2b1-41df-8048-752b56d26653" Jan 29 17:03:40 crc kubenswrapper[4886]: E0129 17:03:40.127406 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="88c8ef15-a2b1-41df-8048-752b56d26653" Jan 29 17:03:40 crc kubenswrapper[4886]: E0129 17:03:40.796772 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" podUID="6508ccc6-d71f-449d-bbe1-83270d005815" Jan 29 17:03:41 crc kubenswrapper[4886]: I0129 17:03:41.752423 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xhds2"] Jan 29 17:03:42 crc kubenswrapper[4886]: I0129 17:03:42.383585 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:03:42 crc kubenswrapper[4886]: I0129 17:03:42.383676 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:03:42 crc kubenswrapper[4886]: I0129 17:03:42.389714 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:03:42 crc kubenswrapper[4886]: I0129 17:03:42.702148 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 17:03:42 crc kubenswrapper[4886]: I0129 17:03:42.843358 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69c97cc7f-npplt" Jan 29 17:03:42 crc kubenswrapper[4886]: I0129 17:03:42.871239 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Jan 29 17:03:42 crc kubenswrapper[4886]: I0129 17:03:42.927879 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d44f9f6d-wvkcd"] Jan 29 17:03:42 crc kubenswrapper[4886]: E0129 17:03:42.949043 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 29 17:03:42 crc kubenswrapper[4886]: E0129 17:03:42.949229 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k7khl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(954d7d1e-fd92-4c83-87d8-87a1f866dbbe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:03:42 crc kubenswrapper[4886]: E0129 17:03:42.950989 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="954d7d1e-fd92-4c83-87d8-87a1f866dbbe" Jan 29 17:03:43 crc kubenswrapper[4886]: E0129 17:03:43.014495 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 29 17:03:43 crc kubenswrapper[4886]: E0129 17:03:43.014680 4886 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2mz6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(98bed306-aa68-4e53-affc-e04497079ccb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:03:43 crc kubenswrapper[4886]: E0129 17:03:43.015838 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="98bed306-aa68-4e53-affc-e04497079ccb" Jan 29 17:03:43 crc kubenswrapper[4886]: E0129 17:03:43.847990 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="98bed306-aa68-4e53-affc-e04497079ccb" Jan 29 17:03:43 crc kubenswrapper[4886]: E0129 17:03:43.848016 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="954d7d1e-fd92-4c83-87d8-87a1f866dbbe" Jan 29 17:03:46 crc kubenswrapper[4886]: I0129 17:03:46.615449 4886 scope.go:117] "RemoveContainer" 
containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:03:46 crc kubenswrapper[4886]: E0129 17:03:46.616005 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:03:47 crc kubenswrapper[4886]: W0129 17:03:47.458603 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03dc141f_69cc_4cb4_af0b_acf85642b86e.slice/crio-8a8b9b3d461cc6336b71e2f4a1f54440c360e4b681c82c16795bce27f841af7e WatchSource:0}: Error finding container 8a8b9b3d461cc6336b71e2f4a1f54440c360e4b681c82c16795bce27f841af7e: Status 404 returned error can't find the container with id 8a8b9b3d461cc6336b71e2f4a1f54440c360e4b681c82c16795bce27f841af7e Jan 29 17:03:47 crc kubenswrapper[4886]: I0129 17:03:47.885299 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xhds2" event={"ID":"03dc141f-69cc-4cb4-af0b-acf85642b86e","Type":"ContainerStarted","Data":"8a8b9b3d461cc6336b71e2f4a1f54440c360e4b681c82c16795bce27f841af7e"} Jan 29 17:03:47 crc kubenswrapper[4886]: I0129 17:03:47.887115 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7b015d0c-8672-450a-a079-965cc4ccd07f","Type":"ContainerStarted","Data":"55fb09172ecfe543ed3055282effeb7cac42ad3317ded6fadc58a6e1afee04a0"} Jan 29 17:03:48 crc kubenswrapper[4886]: W0129 17:03:48.298959 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39601bb5_f2bc_47a6_824a_609c207b963f.slice/crio-432d536255059a87132e92e40237fe7c882a36d7e32055ccf635103518ecbec9 WatchSource:0}: Error finding container 432d536255059a87132e92e40237fe7c882a36d7e32055ccf635103518ecbec9: Status 404 returned error can't find the container with id 432d536255059a87132e92e40237fe7c882a36d7e32055ccf635103518ecbec9 Jan 29 17:03:48 crc kubenswrapper[4886]: I0129 17:03:48.900391 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"39601bb5-f2bc-47a6-824a-609c207b963f","Type":"ContainerStarted","Data":"432d536255059a87132e92e40237fe7c882a36d7e32055ccf635103518ecbec9"} Jan 29 17:03:49 crc kubenswrapper[4886]: I0129 17:03:49.911938 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b7d9p" event={"ID":"544b4515-481c-47f1-acb6-ed332a3497d4","Type":"ContainerStarted","Data":"31925dc2b4451bded2a4f8317ce799c155f8528fe1011988d10f0aa3ff739d00"} Jan 29 17:03:49 crc kubenswrapper[4886]: I0129 17:03:49.912466 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-b7d9p" Jan 29 17:03:49 crc kubenswrapper[4886]: I0129 17:03:49.936112 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-b7d9p" podStartSLOduration=61.849424328 podStartE2EDuration="1m16.936090016s" podCreationTimestamp="2026-01-29 17:02:33 +0000 UTC" firstStartedPulling="2026-01-29 17:03:34.257614218 +0000 UTC m=+2497.166333490" lastFinishedPulling="2026-01-29 17:03:49.344279906 +0000 UTC m=+2512.252999178" observedRunningTime="2026-01-29 17:03:49.93008498 
+0000 UTC m=+2512.838804262" watchObservedRunningTime="2026-01-29 17:03:49.936090016 +0000 UTC m=+2512.844809288" Jan 29 17:03:51 crc kubenswrapper[4886]: I0129 17:03:51.495430 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" event={"ID":"ee1da890-a690-46b4-95aa-3f282b3cdc30","Type":"ContainerStarted","Data":"5688c50792f3c3255c84d31bad3708c97035368da7715c74f1a56056b63a6746"} Jan 29 17:03:51 crc kubenswrapper[4886]: I0129 17:03:51.498024 4886 generic.go:334] "Generic (PLEG): container finished" podID="3748c627-3deb-4b89-acd3-2269f42ba343" containerID="fcac16ce7b565761d87666d9cf26f0b7bab43d40d9fedf5938d903160f00e164" exitCode=0 Jan 29 17:03:51 crc kubenswrapper[4886]: I0129 17:03:51.498071 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" event={"ID":"3748c627-3deb-4b89-acd3-2269f42ba343","Type":"ContainerDied","Data":"fcac16ce7b565761d87666d9cf26f0b7bab43d40d9fedf5938d903160f00e164"} Jan 29 17:03:51 crc kubenswrapper[4886]: I0129 17:03:51.506137 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"842bfe4d-04ba-4143-9076-3033163c7b82","Type":"ContainerStarted","Data":"5c98fb62cf57fb19a685fed0c362721e82c04b5d528f5ad7579c1412f1f79e81"} Jan 29 17:03:51 crc kubenswrapper[4886]: I0129 17:03:51.526589 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10","Type":"ContainerStarted","Data":"e164b2712bb12971248661528d0d661417a2f6869697cd179a3843bd4e2721f1"} Jan 29 17:03:51 crc kubenswrapper[4886]: I0129 17:03:51.539515 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-ld46c" podStartSLOduration=64.791384381 podStartE2EDuration="1m20.539497287s" podCreationTimestamp="2026-01-29 17:02:31 +0000 UTC" firstStartedPulling="2026-01-29 17:03:34.017638278 +0000 UTC m=+2496.926357550" lastFinishedPulling="2026-01-29 17:03:49.765751184 +0000 UTC m=+2512.674470456" observedRunningTime="2026-01-29 17:03:51.518709255 +0000 UTC m=+2514.427428537" watchObservedRunningTime="2026-01-29 17:03:51.539497287 +0000 UTC m=+2514.448216559" Jan 29 17:03:51 crc kubenswrapper[4886]: E0129 17:03:51.837010 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 29 17:03:51 crc kubenswrapper[4886]: E0129 17:03:51.837393 4886 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 29 17:03:51 crc kubenswrapper[4886]: E0129 17:03:51.837622 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xrp8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(dba0c99a-0f14-42bd-8822-ee79fc73ee41): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" logger="UnhandledError" Jan 29 17:03:51 crc kubenswrapper[4886]: E0129 17:03:51.839043 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="dba0c99a-0f14-42bd-8822-ee79fc73ee41" Jan 29 17:03:52 crc kubenswrapper[4886]: I0129 17:03:52.537542 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9d0db9ae-746b-419a-bc61-bf85645d2bff","Type":"ContainerStarted","Data":"90c62e1af999c12bd3cee48206c3c037d5e41331e61dd2c2d6e99f50a71acbba"} Jan 29 17:03:52 crc kubenswrapper[4886]: I0129 17:03:52.539302 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b0be43b-8956-45aa-ad50-de9183b3fea3","Type":"ContainerStarted","Data":"121b418980e461ff82cc0059422b3aec6e494e5fd4c123ffbab962202999757c"} Jan 29 17:03:52 crc kubenswrapper[4886]: E0129 17:03:52.541246 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="dba0c99a-0f14-42bd-8822-ee79fc73ee41" Jan 29 17:03:54 crc kubenswrapper[4886]: I0129 17:03:54.577861 4886 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"39601bb5-f2bc-47a6-824a-609c207b963f","Type":"ContainerStarted","Data":"bf239556e30f9137e020bc2a6c81d2fdb898af7395a712452ca6968c9abdf04d"} Jan 29 17:03:54 crc kubenswrapper[4886]: I0129 17:03:54.587910 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7b015d0c-8672-450a-a079-965cc4ccd07f","Type":"ContainerStarted","Data":"45a24140137200a26c74210530849bf906a138a61cb80a258cb55968228dcfec"} Jan 29 17:03:54 crc kubenswrapper[4886]: I0129 17:03:54.592134 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" event={"ID":"3748c627-3deb-4b89-acd3-2269f42ba343","Type":"ContainerStarted","Data":"85f248c363891313b6dfd3563ffece575be09f0a7b8fb96dd58a65634816d1bc"} Jan 29 17:03:54 crc kubenswrapper[4886]: I0129 17:03:54.592487 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:03:54 crc kubenswrapper[4886]: I0129 17:03:54.597372 4886 generic.go:334] "Generic (PLEG): container finished" podID="03dc141f-69cc-4cb4-af0b-acf85642b86e" containerID="99eea0285f6f5f01492d9cbe469c801bd291548fbbceb2527113ae1fb3f63482" exitCode=0 Jan 29 17:03:54 crc kubenswrapper[4886]: I0129 17:03:54.597430 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xhds2" event={"ID":"03dc141f-69cc-4cb4-af0b-acf85642b86e","Type":"ContainerDied","Data":"99eea0285f6f5f01492d9cbe469c801bd291548fbbceb2527113ae1fb3f63482"} Jan 29 17:03:54 crc kubenswrapper[4886]: I0129 17:03:54.616625 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" podStartSLOduration=6.509872275 podStartE2EDuration="1m31.616607507s" podCreationTimestamp="2026-01-29 17:02:23 +0000 UTC" firstStartedPulling="2026-01-29 17:02:24.744303361 +0000 UTC m=+2427.653022633" lastFinishedPulling="2026-01-29 17:03:49.851038603 +0000 UTC m=+2512.759757865" observedRunningTime="2026-01-29 17:03:54.609993765 +0000 UTC m=+2517.518713057" watchObservedRunningTime="2026-01-29 17:03:54.616607507 +0000 UTC m=+2517.525326779" Jan 29 17:03:55 crc kubenswrapper[4886]: I0129 17:03:55.613456 4886 generic.go:334] "Generic (PLEG): container finished" podID="6508ccc6-d71f-449d-bbe1-83270d005815" containerID="89f82f42c505d87726312a538c1469519937b08750e6ec80466cc82da8aa0837" exitCode=0 Jan 29 17:03:55 crc kubenswrapper[4886]: I0129 17:03:55.613541 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" event={"ID":"6508ccc6-d71f-449d-bbe1-83270d005815","Type":"ContainerDied","Data":"89f82f42c505d87726312a538c1469519937b08750e6ec80466cc82da8aa0837"} Jan 29 17:03:55 crc kubenswrapper[4886]: I0129 17:03:55.619019 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"88c8ef15-a2b1-41df-8048-752b56d26653","Type":"ContainerStarted","Data":"10ebf425973cf40d094dde67b66d655c13aa2955f48ae0a6b4c41a153e79e60c"} Jan 29 17:03:55 crc kubenswrapper[4886]: I0129 17:03:55.619669 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 29 17:03:55 crc kubenswrapper[4886]: I0129 17:03:55.622961 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xhds2" event={"ID":"03dc141f-69cc-4cb4-af0b-acf85642b86e","Type":"ContainerStarted","Data":"948b2e4020afbef71d55c5d817cc8c2776b65ce432a68964ab8a4796a4e42a9e"} Jan 29 17:03:55 crc 
kubenswrapper[4886]: I0129 17:03:55.623007 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xhds2" event={"ID":"03dc141f-69cc-4cb4-af0b-acf85642b86e","Type":"ContainerStarted","Data":"4846a08c86363260321db374e444fb76c2cb5ca480f29eedecee09311b820036"} Jan 29 17:03:55 crc kubenswrapper[4886]: I0129 17:03:55.657885 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.356490074 podStartE2EDuration="1m27.657859076s" podCreationTimestamp="2026-01-29 17:02:28 +0000 UTC" firstStartedPulling="2026-01-29 17:02:29.131053442 +0000 UTC m=+2432.039772714" lastFinishedPulling="2026-01-29 17:03:54.432422444 +0000 UTC m=+2517.341141716" observedRunningTime="2026-01-29 17:03:55.653010363 +0000 UTC m=+2518.561729645" watchObservedRunningTime="2026-01-29 17:03:55.657859076 +0000 UTC m=+2518.566578348" Jan 29 17:03:55 crc kubenswrapper[4886]: I0129 17:03:55.673729 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-xhds2" podStartSLOduration=76.705789184 podStartE2EDuration="1m22.673709423s" podCreationTimestamp="2026-01-29 17:02:33 +0000 UTC" firstStartedPulling="2026-01-29 17:03:47.463694631 +0000 UTC m=+2510.372413903" lastFinishedPulling="2026-01-29 17:03:53.43161487 +0000 UTC m=+2516.340334142" observedRunningTime="2026-01-29 17:03:55.671368148 +0000 UTC m=+2518.580087450" watchObservedRunningTime="2026-01-29 17:03:55.673709423 +0000 UTC m=+2518.582428705" Jan 29 17:03:56 crc kubenswrapper[4886]: I0129 17:03:56.637059 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"39601bb5-f2bc-47a6-824a-609c207b963f","Type":"ContainerStarted","Data":"890150ec302223a8e2d169c0d885780677db4f8f7357b4039823615911ec1fdd"} Jan 29 17:03:56 crc kubenswrapper[4886]: I0129 17:03:56.639693 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce7955a1-eb58-425a-872a-7ec102b8e090","Type":"ContainerStarted","Data":"583c2c73cc1b55ad9f4f022652302dc10ae77e94e45a693b0865ff8b717978ab"} Jan 29 17:03:56 crc kubenswrapper[4886]: I0129 17:03:56.641415 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7b015d0c-8672-450a-a079-965cc4ccd07f","Type":"ContainerStarted","Data":"9a0f61abb3b2a2a9f53d2b44347a687f4bb47ba68928cafd5016f226170d4374"} Jan 29 17:03:56 crc kubenswrapper[4886]: I0129 17:03:56.645402 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" event={"ID":"6508ccc6-d71f-449d-bbe1-83270d005815","Type":"ContainerStarted","Data":"551d6bb92bd8b9f6b94728550021f0d9b88f84765724d42a9ae9096869fe7939"} Jan 29 17:03:56 crc kubenswrapper[4886]: I0129 17:03:56.645589 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:03:56 crc kubenswrapper[4886]: I0129 17:03:56.648263 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"954d7d1e-fd92-4c83-87d8-87a1f866dbbe","Type":"ContainerStarted","Data":"01b438318caf5eaf9a57468dc2cc9bed9f702f5dc44dd9743a37737048ccabed"} Jan 29 17:03:56 crc kubenswrapper[4886]: I0129 17:03:56.648304 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:03:56 crc kubenswrapper[4886]: I0129 17:03:56.648577 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:03:56 crc kubenswrapper[4886]: I0129 17:03:56.691755 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=73.169336337 podStartE2EDuration="1m20.69172979s" podCreationTimestamp="2026-01-29 17:02:36 +0000 UTC" firstStartedPulling="2026-01-29 17:03:48.397666665 +0000 UTC m=+2511.306385937" lastFinishedPulling="2026-01-29 17:03:55.920060118 +0000 UTC m=+2518.828779390" observedRunningTime="2026-01-29 17:03:56.687290278 +0000 UTC m=+2519.596009550" watchObservedRunningTime="2026-01-29 17:03:56.69172979 +0000 UTC m=+2519.600449062" Jan 29 17:03:56 crc kubenswrapper[4886]: I0129 17:03:56.731197 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=72.248392743 podStartE2EDuration="1m20.731173157s" podCreationTimestamp="2026-01-29 17:02:36 +0000 UTC" firstStartedPulling="2026-01-29 17:03:47.420022468 +0000 UTC m=+2510.328741740" lastFinishedPulling="2026-01-29 17:03:55.902802882 +0000 UTC m=+2518.811522154" observedRunningTime="2026-01-29 17:03:56.718704313 +0000 UTC m=+2519.627423585" watchObservedRunningTime="2026-01-29 17:03:56.731173157 +0000 UTC m=+2519.639892429" Jan 29 17:03:56 crc kubenswrapper[4886]: I0129 17:03:56.746653 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" podStartSLOduration=-9223371944.10814 podStartE2EDuration="1m32.746634003s" podCreationTimestamp="2026-01-29 17:02:24 +0000 UTC" firstStartedPulling="2026-01-29 17:02:25.070696641 +0000 UTC m=+2427.979415913" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:03:56.746450697 +0000 UTC m=+2519.655169969" watchObservedRunningTime="2026-01-29 17:03:56.746634003 +0000 UTC m=+2519.655353275" Jan 29 17:03:57 crc kubenswrapper[4886]: I0129 17:03:57.588261 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 29 17:03:57 crc kubenswrapper[4886]: I0129 17:03:57.887962 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 29 17:03:58 crc kubenswrapper[4886]: I0129 17:03:58.587404 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 29 17:03:58 crc kubenswrapper[4886]: I0129 17:03:58.642318 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 29 17:03:58 crc kubenswrapper[4886]: I0129 17:03:58.664875 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"98bed306-aa68-4e53-affc-e04497079ccb","Type":"ContainerStarted","Data":"13269c792a56983291098b79dde6fcee3fc61558ea51917d6a60175381efc4fc"} Jan 29 17:03:58 crc kubenswrapper[4886]: I0129 17:03:58.887850 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 29 17:03:58 crc kubenswrapper[4886]: I0129 17:03:58.930372 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 29 17:03:59 crc kubenswrapper[4886]: I0129 17:03:59.124615 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:03:59 crc kubenswrapper[4886]: I0129 17:03:59.615265 4886 scope.go:117] "RemoveContainer" 
containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:03:59 crc kubenswrapper[4886]: E0129 17:03:59.615545 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:03:59 crc kubenswrapper[4886]: I0129 17:03:59.722298 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 29 17:03:59 crc kubenswrapper[4886]: I0129 17:03:59.730082 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.004275 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bqbqx"] Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.005194 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" podUID="6508ccc6-d71f-449d-bbe1-83270d005815" containerName="dnsmasq-dns" containerID="cri-o://551d6bb92bd8b9f6b94728550021f0d9b88f84765724d42a9ae9096869fe7939" gracePeriod=10 Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.040906 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-6lgfs"] Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.042545 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.046643 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.055012 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-6lgfs"] Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.148654 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.148975 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.149095 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk4z9\" (UniqueName: \"kubernetes.io/projected/c05aff31-e011-4872-80bf-18f1b32a16e6-kube-api-access-sk4z9\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.149131 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-config\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.173947 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6f8zt"] Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.175648 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.177837 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.185409 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6f8zt"] Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.250582 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ff160c34-86ad-4048-9c67-2071e6c38373-ovs-rundir\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.250658 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmj5h\" (UniqueName: \"kubernetes.io/projected/ff160c34-86ad-4048-9c67-2071e6c38373-kube-api-access-pmj5h\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.250734 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk4z9\" (UniqueName: \"kubernetes.io/projected/c05aff31-e011-4872-80bf-18f1b32a16e6-kube-api-access-sk4z9\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.250777 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ff160c34-86ad-4048-9c67-2071e6c38373-ovn-rundir\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.250814 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff160c34-86ad-4048-9c67-2071e6c38373-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.250846 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff160c34-86ad-4048-9c67-2071e6c38373-combined-ca-bundle\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.250875 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-config\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.250908 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff160c34-86ad-4048-9c67-2071e6c38373-config\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.250990 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.251080 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.251960 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-config\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.260679 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-6lgfs"] Jan 29 17:04:00 crc kubenswrapper[4886]: E0129 17:04:00.261434 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[dns-svc kube-api-access-sk4z9 ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" podUID="c05aff31-e011-4872-80bf-18f1b32a16e6" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.264817 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.264866 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.292354 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk4z9\" (UniqueName: \"kubernetes.io/projected/c05aff31-e011-4872-80bf-18f1b32a16e6-kube-api-access-sk4z9\") pod \"dnsmasq-dns-7f896c8c65-6lgfs\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.297253 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 29 17:04:00 crc 
kubenswrapper[4886]: I0129 17:04:00.299003 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.301115 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.301397 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.301556 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.301662 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-87p4g" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.312785 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-29gw9"] Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.315049 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.320209 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.334713 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.355970 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc04c928-b93c-49a3-a653-f82b5e686da5-scripts\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356031 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc04c928-b93c-49a3-a653-f82b5e686da5-config\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356089 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc04c928-b93c-49a3-a653-f82b5e686da5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356117 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jqzk\" (UniqueName: \"kubernetes.io/projected/dc04c928-b93c-49a3-a653-f82b5e686da5-kube-api-access-8jqzk\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356162 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc04c928-b93c-49a3-a653-f82b5e686da5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356205 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/dc04c928-b93c-49a3-a653-f82b5e686da5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356228 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc04c928-b93c-49a3-a653-f82b5e686da5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356261 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ff160c34-86ad-4048-9c67-2071e6c38373-ovs-rundir\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356293 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmj5h\" (UniqueName: \"kubernetes.io/projected/ff160c34-86ad-4048-9c67-2071e6c38373-kube-api-access-pmj5h\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356373 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ff160c34-86ad-4048-9c67-2071e6c38373-ovn-rundir\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356400 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff160c34-86ad-4048-9c67-2071e6c38373-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356425 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff160c34-86ad-4048-9c67-2071e6c38373-combined-ca-bundle\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.356461 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff160c34-86ad-4048-9c67-2071e6c38373-config\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.357571 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ff160c34-86ad-4048-9c67-2071e6c38373-ovs-rundir\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.358077 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ff160c34-86ad-4048-9c67-2071e6c38373-ovn-rundir\") pod 
\"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.368018 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff160c34-86ad-4048-9c67-2071e6c38373-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.376438 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff160c34-86ad-4048-9c67-2071e6c38373-config\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.387088 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff160c34-86ad-4048-9c67-2071e6c38373-combined-ca-bundle\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.416199 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-29gw9"] Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.420851 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmj5h\" (UniqueName: \"kubernetes.io/projected/ff160c34-86ad-4048-9c67-2071e6c38373-kube-api-access-pmj5h\") pod \"ovn-controller-metrics-6f8zt\" (UID: \"ff160c34-86ad-4048-9c67-2071e6c38373\") " pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463475 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc04c928-b93c-49a3-a653-f82b5e686da5-config\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463534 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc04c928-b93c-49a3-a653-f82b5e686da5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463564 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5gl6\" (UniqueName: \"kubernetes.io/projected/4ef7b166-c078-4530-b05b-ae3e44088122-kube-api-access-h5gl6\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463583 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jqzk\" (UniqueName: \"kubernetes.io/projected/dc04c928-b93c-49a3-a653-f82b5e686da5-kube-api-access-8jqzk\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463633 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc04c928-b93c-49a3-a653-f82b5e686da5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463662 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463680 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463735 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc04c928-b93c-49a3-a653-f82b5e686da5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463761 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc04c928-b93c-49a3-a653-f82b5e686da5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463833 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-config\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463849 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.463901 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc04c928-b93c-49a3-a653-f82b5e686da5-scripts\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.464883 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc04c928-b93c-49a3-a653-f82b5e686da5-scripts\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.465421 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc04c928-b93c-49a3-a653-f82b5e686da5-config\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: 
I0129 17:04:00.466567 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc04c928-b93c-49a3-a653-f82b5e686da5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.485086 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc04c928-b93c-49a3-a653-f82b5e686da5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.495637 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc04c928-b93c-49a3-a653-f82b5e686da5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.496516 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc04c928-b93c-49a3-a653-f82b5e686da5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.509601 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jqzk\" (UniqueName: \"kubernetes.io/projected/dc04c928-b93c-49a3-a653-f82b5e686da5-kube-api-access-8jqzk\") pod \"ovn-northd-0\" (UID: \"dc04c928-b93c-49a3-a653-f82b5e686da5\") " pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.542349 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6f8zt" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.566345 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-config\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.566383 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.566464 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5gl6\" (UniqueName: \"kubernetes.io/projected/4ef7b166-c078-4530-b05b-ae3e44088122-kube-api-access-h5gl6\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.566509 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.566529 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.567447 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.568007 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-config\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.568353 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.569512 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.590166 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.627167 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5gl6\" (UniqueName: \"kubernetes.io/projected/4ef7b166-c078-4530-b05b-ae3e44088122-kube-api-access-h5gl6\") pod \"dnsmasq-dns-86db49b7ff-29gw9\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.728438 4886 generic.go:334] "Generic (PLEG): container finished" podID="6508ccc6-d71f-449d-bbe1-83270d005815" containerID="551d6bb92bd8b9f6b94728550021f0d9b88f84765724d42a9ae9096869fe7939" exitCode=0 Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.728975 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" event={"ID":"6508ccc6-d71f-449d-bbe1-83270d005815","Type":"ContainerDied","Data":"551d6bb92bd8b9f6b94728550021f0d9b88f84765724d42a9ae9096869fe7939"} Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.729094 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.756266 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.756800 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.878838 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-dns-svc\") pod \"c05aff31-e011-4872-80bf-18f1b32a16e6\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.879208 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk4z9\" (UniqueName: \"kubernetes.io/projected/c05aff31-e011-4872-80bf-18f1b32a16e6-kube-api-access-sk4z9\") pod \"c05aff31-e011-4872-80bf-18f1b32a16e6\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.879289 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-config\") pod \"c05aff31-e011-4872-80bf-18f1b32a16e6\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.879336 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-ovsdbserver-sb\") pod \"c05aff31-e011-4872-80bf-18f1b32a16e6\" (UID: \"c05aff31-e011-4872-80bf-18f1b32a16e6\") " Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.879396 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-config\") pod \"6508ccc6-d71f-449d-bbe1-83270d005815\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.879404 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c05aff31-e011-4872-80bf-18f1b32a16e6" (UID: "c05aff31-e011-4872-80bf-18f1b32a16e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.879423 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-dns-svc\") pod \"6508ccc6-d71f-449d-bbe1-83270d005815\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.879496 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb44s\" (UniqueName: \"kubernetes.io/projected/6508ccc6-d71f-449d-bbe1-83270d005815-kube-api-access-kb44s\") pod \"6508ccc6-d71f-449d-bbe1-83270d005815\" (UID: \"6508ccc6-d71f-449d-bbe1-83270d005815\") " Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.879670 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-config" (OuterVolumeSpecName: "config") pod "c05aff31-e011-4872-80bf-18f1b32a16e6" (UID: "c05aff31-e011-4872-80bf-18f1b32a16e6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.880264 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.880284 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.884700 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c05aff31-e011-4872-80bf-18f1b32a16e6-kube-api-access-sk4z9" (OuterVolumeSpecName: "kube-api-access-sk4z9") pod "c05aff31-e011-4872-80bf-18f1b32a16e6" (UID: "c05aff31-e011-4872-80bf-18f1b32a16e6"). InnerVolumeSpecName "kube-api-access-sk4z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.887237 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c05aff31-e011-4872-80bf-18f1b32a16e6" (UID: "c05aff31-e011-4872-80bf-18f1b32a16e6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.890296 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6508ccc6-d71f-449d-bbe1-83270d005815-kube-api-access-kb44s" (OuterVolumeSpecName: "kube-api-access-kb44s") pod "6508ccc6-d71f-449d-bbe1-83270d005815" (UID: "6508ccc6-d71f-449d-bbe1-83270d005815"). InnerVolumeSpecName "kube-api-access-kb44s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.915911 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.936148 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6508ccc6-d71f-449d-bbe1-83270d005815" (UID: "6508ccc6-d71f-449d-bbe1-83270d005815"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.950554 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-config" (OuterVolumeSpecName: "config") pod "6508ccc6-d71f-449d-bbe1-83270d005815" (UID: "6508ccc6-d71f-449d-bbe1-83270d005815"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.981824 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.981855 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6508ccc6-d71f-449d-bbe1-83270d005815-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.981866 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb44s\" (UniqueName: \"kubernetes.io/projected/6508ccc6-d71f-449d-bbe1-83270d005815-kube-api-access-kb44s\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.981876 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk4z9\" (UniqueName: \"kubernetes.io/projected/c05aff31-e011-4872-80bf-18f1b32a16e6-kube-api-access-sk4z9\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:00 crc kubenswrapper[4886]: I0129 17:04:00.981885 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c05aff31-e011-4872-80bf-18f1b32a16e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.247204 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6f8zt"] Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.255379 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.470911 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-29gw9"] Jan 29 17:04:01 crc kubenswrapper[4886]: W0129 17:04:01.485149 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ef7b166_c078_4530_b05b_ae3e44088122.slice/crio-b30007dc7ac0cb559fa26a9b1b3904c3d91b03c66e5d4e617cb72bf920854daa WatchSource:0}: Error finding container b30007dc7ac0cb559fa26a9b1b3904c3d91b03c66e5d4e617cb72bf920854daa: Status 404 returned error can't find the container with id b30007dc7ac0cb559fa26a9b1b3904c3d91b03c66e5d4e617cb72bf920854daa Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.738224 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6f8zt" event={"ID":"ff160c34-86ad-4048-9c67-2071e6c38373","Type":"ContainerStarted","Data":"691ff7220e1361142913343ee9d06191daae7edcbc017e98673318e7c4dcf180"} Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.738281 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6f8zt" event={"ID":"ff160c34-86ad-4048-9c67-2071e6c38373","Type":"ContainerStarted","Data":"3f6d99f29803aa2336fff1711f2e7466d9a294ec38bb78c9a79953dda4e63501"} Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.740361 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.740361 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bqbqx" event={"ID":"6508ccc6-d71f-449d-bbe1-83270d005815","Type":"ContainerDied","Data":"3cb5dbf55000d2d62fd9df0707aa0b2ae3790c985165faca182a19e1e38e6908"} Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.740508 4886 scope.go:117] "RemoveContainer" containerID="551d6bb92bd8b9f6b94728550021f0d9b88f84765724d42a9ae9096869fe7939" Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.742395 4886 generic.go:334] "Generic (PLEG): container finished" podID="954d7d1e-fd92-4c83-87d8-87a1f866dbbe" containerID="01b438318caf5eaf9a57468dc2cc9bed9f702f5dc44dd9743a37737048ccabed" exitCode=0 Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.742450 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"954d7d1e-fd92-4c83-87d8-87a1f866dbbe","Type":"ContainerDied","Data":"01b438318caf5eaf9a57468dc2cc9bed9f702f5dc44dd9743a37737048ccabed"} Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.744678 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dc04c928-b93c-49a3-a653-f82b5e686da5","Type":"ContainerStarted","Data":"44432bb9efbb5eec2088da4bed39ca91585697f30ae31fbcae52c1a9fa8c6ba9"} Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.748297 4886 generic.go:334] "Generic (PLEG): container finished" podID="4ef7b166-c078-4530-b05b-ae3e44088122" containerID="cbbe07486135ddfe120920c1f4f9ccadece896cbebac702a4fee9f0d2022f4db" exitCode=0 Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.748378 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-6lgfs" Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.748494 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" event={"ID":"4ef7b166-c078-4530-b05b-ae3e44088122","Type":"ContainerDied","Data":"cbbe07486135ddfe120920c1f4f9ccadece896cbebac702a4fee9f0d2022f4db"} Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.748548 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" event={"ID":"4ef7b166-c078-4530-b05b-ae3e44088122","Type":"ContainerStarted","Data":"b30007dc7ac0cb559fa26a9b1b3904c3d91b03c66e5d4e617cb72bf920854daa"} Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.801936 4886 scope.go:117] "RemoveContainer" containerID="89f82f42c505d87726312a538c1469519937b08750e6ec80466cc82da8aa0837" Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.826715 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6f8zt" podStartSLOduration=1.826695419 podStartE2EDuration="1.826695419s" podCreationTimestamp="2026-01-29 17:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:01.756438424 +0000 UTC m=+2524.665157696" watchObservedRunningTime="2026-01-29 17:04:01.826695419 +0000 UTC m=+2524.735414691" Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.930085 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-6lgfs"] Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.938611 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-6lgfs"] Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.946078 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bqbqx"] Jan 29 17:04:01 crc kubenswrapper[4886]: I0129 17:04:01.954082 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bqbqx"] Jan 29 17:04:02 crc kubenswrapper[4886]: I0129 17:04:02.630913 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6508ccc6-d71f-449d-bbe1-83270d005815" path="/var/lib/kubelet/pods/6508ccc6-d71f-449d-bbe1-83270d005815/volumes" Jan 29 17:04:02 crc kubenswrapper[4886]: I0129 17:04:02.632342 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c05aff31-e011-4872-80bf-18f1b32a16e6" path="/var/lib/kubelet/pods/c05aff31-e011-4872-80bf-18f1b32a16e6/volumes" Jan 29 17:04:02 crc kubenswrapper[4886]: I0129 17:04:02.771077 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"954d7d1e-fd92-4c83-87d8-87a1f866dbbe","Type":"ContainerStarted","Data":"49bc2884b26abe4f9087c468400ed26f82e277abb56ff1ac1083e5b7f95edffe"} Jan 29 17:04:02 crc kubenswrapper[4886]: I0129 17:04:02.774235 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" event={"ID":"4ef7b166-c078-4530-b05b-ae3e44088122","Type":"ContainerStarted","Data":"e0d2fbb581e1f1576641f1d25760b3a9a9b2fc1c9e7db710f6875c72957b1c0b"} Jan 29 17:04:02 crc kubenswrapper[4886]: I0129 17:04:02.774614 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:02 crc kubenswrapper[4886]: I0129 17:04:02.776740 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="98bed306-aa68-4e53-affc-e04497079ccb" containerID="13269c792a56983291098b79dde6fcee3fc61558ea51917d6a60175381efc4fc" exitCode=0 Jan 29 17:04:02 crc kubenswrapper[4886]: I0129 17:04:02.777165 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"98bed306-aa68-4e53-affc-e04497079ccb","Type":"ContainerDied","Data":"13269c792a56983291098b79dde6fcee3fc61558ea51917d6a60175381efc4fc"} Jan 29 17:04:02 crc kubenswrapper[4886]: I0129 17:04:02.796386 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.955985326 podStartE2EDuration="1m36.796368756s" podCreationTimestamp="2026-01-29 17:02:26 +0000 UTC" firstStartedPulling="2026-01-29 17:02:29.422604692 +0000 UTC m=+2432.331323964" lastFinishedPulling="2026-01-29 17:03:56.262988122 +0000 UTC m=+2519.171707394" observedRunningTime="2026-01-29 17:04:02.793775154 +0000 UTC m=+2525.702494426" watchObservedRunningTime="2026-01-29 17:04:02.796368756 +0000 UTC m=+2525.705088028" Jan 29 17:04:02 crc kubenswrapper[4886]: I0129 17:04:02.822214 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" podStartSLOduration=2.822196007 podStartE2EDuration="2.822196007s" podCreationTimestamp="2026-01-29 17:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:02.815572455 +0000 UTC m=+2525.724291737" watchObservedRunningTime="2026-01-29 17:04:02.822196007 +0000 UTC m=+2525.730915269" Jan 29 17:04:03 crc kubenswrapper[4886]: I0129 17:04:03.496312 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 29 17:04:03 crc kubenswrapper[4886]: I0129 17:04:03.793465 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"98bed306-aa68-4e53-affc-e04497079ccb","Type":"ContainerStarted","Data":"5705babd04f038e45524f2765a20c44405227f6554f54075ed01b05809eea45e"} Jan 29 17:04:03 crc kubenswrapper[4886]: I0129 17:04:03.795124 4886 generic.go:334] "Generic (PLEG): container finished" podID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerID="583c2c73cc1b55ad9f4f022652302dc10ae77e94e45a693b0865ff8b717978ab" exitCode=0 Jan 29 17:04:03 crc kubenswrapper[4886]: I0129 17:04:03.795174 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce7955a1-eb58-425a-872a-7ec102b8e090","Type":"ContainerDied","Data":"583c2c73cc1b55ad9f4f022652302dc10ae77e94e45a693b0865ff8b717978ab"} Jan 29 17:04:03 crc kubenswrapper[4886]: I0129 17:04:03.832746 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371938.022049 podStartE2EDuration="1m38.832727299s" podCreationTimestamp="2026-01-29 17:02:25 +0000 UTC" firstStartedPulling="2026-01-29 17:02:27.530602762 +0000 UTC m=+2430.439322034" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:03.827656519 +0000 UTC m=+2526.736375801" watchObservedRunningTime="2026-01-29 17:04:03.832727299 +0000 UTC m=+2526.741446571" Jan 29 17:04:06 crc kubenswrapper[4886]: I0129 17:04:06.823583 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 29 17:04:06 crc kubenswrapper[4886]: I0129 17:04:06.823969 4886 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 29 17:04:07 crc kubenswrapper[4886]: I0129 17:04:07.996537 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7d44f9f6d-wvkcd" podUID="d7eb0acf-dfc4-4c24-8231-bfae5b620653" containerName="console" containerID="cri-o://83d754bde6259c4ef4756a1b0a86efc202f6d81cccfa70e563b1ad9cae41b68f" gracePeriod=15 Jan 29 17:04:08 crc kubenswrapper[4886]: I0129 17:04:08.459633 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 29 17:04:08 crc kubenswrapper[4886]: I0129 17:04:08.459684 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 29 17:04:08 crc kubenswrapper[4886]: I0129 17:04:08.839072 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d44f9f6d-wvkcd_d7eb0acf-dfc4-4c24-8231-bfae5b620653/console/0.log" Jan 29 17:04:08 crc kubenswrapper[4886]: I0129 17:04:08.839320 4886 generic.go:334] "Generic (PLEG): container finished" podID="d7eb0acf-dfc4-4c24-8231-bfae5b620653" containerID="83d754bde6259c4ef4756a1b0a86efc202f6d81cccfa70e563b1ad9cae41b68f" exitCode=2 Jan 29 17:04:08 crc kubenswrapper[4886]: I0129 17:04:08.839360 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d44f9f6d-wvkcd" event={"ID":"d7eb0acf-dfc4-4c24-8231-bfae5b620653","Type":"ContainerDied","Data":"83d754bde6259c4ef4756a1b0a86efc202f6d81cccfa70e563b1ad9cae41b68f"} Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.491249 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d44f9f6d-wvkcd_d7eb0acf-dfc4-4c24-8231-bfae5b620653/console/0.log" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.492614 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.647744 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-config\") pod \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.647915 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-trusted-ca-bundle\") pod \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.647938 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-oauth-serving-cert\") pod \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.647980 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-serving-cert\") pod \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.648017 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-oauth-config\") pod \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.648039 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt776\" (UniqueName: \"kubernetes.io/projected/d7eb0acf-dfc4-4c24-8231-bfae5b620653-kube-api-access-vt776\") pod \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.648140 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-service-ca\") pod \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\" (UID: \"d7eb0acf-dfc4-4c24-8231-bfae5b620653\") " Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.648478 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d7eb0acf-dfc4-4c24-8231-bfae5b620653" (UID: "d7eb0acf-dfc4-4c24-8231-bfae5b620653"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.648500 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d7eb0acf-dfc4-4c24-8231-bfae5b620653" (UID: "d7eb0acf-dfc4-4c24-8231-bfae5b620653"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.648549 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-config" (OuterVolumeSpecName: "console-config") pod "d7eb0acf-dfc4-4c24-8231-bfae5b620653" (UID: "d7eb0acf-dfc4-4c24-8231-bfae5b620653"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.648965 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-service-ca" (OuterVolumeSpecName: "service-ca") pod "d7eb0acf-dfc4-4c24-8231-bfae5b620653" (UID: "d7eb0acf-dfc4-4c24-8231-bfae5b620653"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.653205 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d7eb0acf-dfc4-4c24-8231-bfae5b620653" (UID: "d7eb0acf-dfc4-4c24-8231-bfae5b620653"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.654044 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7eb0acf-dfc4-4c24-8231-bfae5b620653-kube-api-access-vt776" (OuterVolumeSpecName: "kube-api-access-vt776") pod "d7eb0acf-dfc4-4c24-8231-bfae5b620653" (UID: "d7eb0acf-dfc4-4c24-8231-bfae5b620653"). InnerVolumeSpecName "kube-api-access-vt776". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.654062 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d7eb0acf-dfc4-4c24-8231-bfae5b620653" (UID: "d7eb0acf-dfc4-4c24-8231-bfae5b620653"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.750907 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt776\" (UniqueName: \"kubernetes.io/projected/d7eb0acf-dfc4-4c24-8231-bfae5b620653-kube-api-access-vt776\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.750944 4886 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.750954 4886 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.750964 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.750972 4886 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7eb0acf-dfc4-4c24-8231-bfae5b620653-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.750980 4886 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.750989 4886 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7eb0acf-dfc4-4c24-8231-bfae5b620653-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.851768 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dc04c928-b93c-49a3-a653-f82b5e686da5","Type":"ContainerStarted","Data":"299f1c944b5c2254f62c4b9d1ad7c85c5444476239d2e24312d2b87d231b97eb"} Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.853923 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d44f9f6d-wvkcd_d7eb0acf-dfc4-4c24-8231-bfae5b620653/console/0.log" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.853974 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d44f9f6d-wvkcd" event={"ID":"d7eb0acf-dfc4-4c24-8231-bfae5b620653","Type":"ContainerDied","Data":"2dde3f8777f56361bbc961c320b3499545e524fdb56d2e7e1762b3c549f1e8ca"} Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.853999 4886 scope.go:117] "RemoveContainer" containerID="83d754bde6259c4ef4756a1b0a86efc202f6d81cccfa70e563b1ad9cae41b68f" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.854095 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d44f9f6d-wvkcd" Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.890766 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d44f9f6d-wvkcd"] Jan 29 17:04:09 crc kubenswrapper[4886]: I0129 17:04:09.902743 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d44f9f6d-wvkcd"] Jan 29 17:04:10 crc kubenswrapper[4886]: E0129 17:04:10.012000 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7eb0acf_dfc4_4c24_8231_bfae5b620653.slice/crio-2dde3f8777f56361bbc961c320b3499545e524fdb56d2e7e1762b3c549f1e8ca\": RecentStats: unable to find data in memory cache]" Jan 29 17:04:10 crc kubenswrapper[4886]: I0129 17:04:10.630379 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7eb0acf-dfc4-4c24-8231-bfae5b620653" path="/var/lib/kubelet/pods/d7eb0acf-dfc4-4c24-8231-bfae5b620653/volumes" Jan 29 17:04:10 crc kubenswrapper[4886]: I0129 17:04:10.871703 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dc04c928-b93c-49a3-a653-f82b5e686da5","Type":"ContainerStarted","Data":"bc05d345a8c98d624229f73d9cd80f1cb6f8add35043ec8de2ca7a9a4647850e"} Jan 29 17:04:10 crc kubenswrapper[4886]: I0129 17:04:10.873420 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 29 17:04:10 crc kubenswrapper[4886]: I0129 17:04:10.910013 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.927404395 podStartE2EDuration="10.909991702s" podCreationTimestamp="2026-01-29 17:04:00 +0000 UTC" firstStartedPulling="2026-01-29 17:04:01.292130946 +0000 UTC m=+2524.200850228" lastFinishedPulling="2026-01-29 17:04:09.274718263 +0000 UTC m=+2532.183437535" observedRunningTime="2026-01-29 17:04:10.891231415 +0000 UTC m=+2533.799950687" watchObservedRunningTime="2026-01-29 17:04:10.909991702 +0000 UTC m=+2533.818710984" Jan 29 17:04:10 crc kubenswrapper[4886]: I0129 17:04:10.919767 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.078565 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tn5pt"] Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.078926 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" podUID="3748c627-3deb-4b89-acd3-2269f42ba343" containerName="dnsmasq-dns" containerID="cri-o://85f248c363891313b6dfd3563ffece575be09f0a7b8fb96dd58a65634816d1bc" gracePeriod=10 Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.116587 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-t8rs7"] Jan 29 17:04:11 crc kubenswrapper[4886]: E0129 17:04:11.117050 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6508ccc6-d71f-449d-bbe1-83270d005815" containerName="init" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.117075 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6508ccc6-d71f-449d-bbe1-83270d005815" containerName="init" Jan 29 17:04:11 crc kubenswrapper[4886]: E0129 17:04:11.117108 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7eb0acf-dfc4-4c24-8231-bfae5b620653" 
containerName="console" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.117116 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7eb0acf-dfc4-4c24-8231-bfae5b620653" containerName="console" Jan 29 17:04:11 crc kubenswrapper[4886]: E0129 17:04:11.117128 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6508ccc6-d71f-449d-bbe1-83270d005815" containerName="dnsmasq-dns" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.117136 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6508ccc6-d71f-449d-bbe1-83270d005815" containerName="dnsmasq-dns" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.117402 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7eb0acf-dfc4-4c24-8231-bfae5b620653" containerName="console" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.117430 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6508ccc6-d71f-449d-bbe1-83270d005815" containerName="dnsmasq-dns" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.118774 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.136408 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-t8rs7"] Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.191609 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czcfr\" (UniqueName: \"kubernetes.io/projected/eb212bbc-3071-4fda-968d-b6d3f19996ee-kube-api-access-czcfr\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.191654 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-config\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.191752 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.191777 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.191814 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-dns-svc\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.295795 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.295860 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.295904 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-dns-svc\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.295967 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czcfr\" (UniqueName: \"kubernetes.io/projected/eb212bbc-3071-4fda-968d-b6d3f19996ee-kube-api-access-czcfr\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.295987 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-config\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.296772 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-config\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.297450 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.297924 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.298752 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-dns-svc\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.352396 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czcfr\" (UniqueName: \"kubernetes.io/projected/eb212bbc-3071-4fda-968d-b6d3f19996ee-kube-api-access-czcfr\") pod \"dnsmasq-dns-698758b865-t8rs7\" (UID: 
\"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.505976 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.825011 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.885225 4886 generic.go:334] "Generic (PLEG): container finished" podID="3748c627-3deb-4b89-acd3-2269f42ba343" containerID="85f248c363891313b6dfd3563ffece575be09f0a7b8fb96dd58a65634816d1bc" exitCode=0 Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.885335 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" event={"ID":"3748c627-3deb-4b89-acd3-2269f42ba343","Type":"ContainerDied","Data":"85f248c363891313b6dfd3563ffece575be09f0a7b8fb96dd58a65634816d1bc"} Jan 29 17:04:11 crc kubenswrapper[4886]: I0129 17:04:11.948674 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.245497 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.252177 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.255514 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.255541 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.255700 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-l9zkf" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.258692 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.270977 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.423415 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-88051746-028d-43a7-b95b-e788ae0f16c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88051746-028d-43a7-b95b-e788ae0f16c4\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.423491 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pwc7\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-kube-api-access-5pwc7\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.423547 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 
crc kubenswrapper[4886]: I0129 17:04:12.423625 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-cache\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.423670 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.423701 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-lock\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.525218 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-cache\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.525315 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.525383 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-lock\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.525451 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-88051746-028d-43a7-b95b-e788ae0f16c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88051746-028d-43a7-b95b-e788ae0f16c4\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.525487 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pwc7\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-kube-api-access-5pwc7\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.525536 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.525687 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-cache\") pod \"swift-storage-0\" (UID: 
\"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.525984 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-lock\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: E0129 17:04:12.526110 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 17:04:12 crc kubenswrapper[4886]: E0129 17:04:12.526133 4886 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 17:04:12 crc kubenswrapper[4886]: E0129 17:04:12.526172 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift podName:6e2f2c6c-bc32-4a32-ba2c-8954d277ce47 nodeName:}" failed. No retries permitted until 2026-01-29 17:04:13.026156575 +0000 UTC m=+2535.934875847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift") pod "swift-storage-0" (UID: "6e2f2c6c-bc32-4a32-ba2c-8954d277ce47") : configmap "swift-ring-files" not found Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.528456 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.528488 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-88051746-028d-43a7-b95b-e788ae0f16c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88051746-028d-43a7-b95b-e788ae0f16c4\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/426a48f8948db7cb55561ec1b18122536ab9cc087c8ed2a6c2cec3e8d4976eec/globalmount\"" pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.534989 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.544435 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pwc7\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-kube-api-access-5pwc7\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.602128 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-88051746-028d-43a7-b95b-e788ae0f16c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88051746-028d-43a7-b95b-e788ae0f16c4\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.902748 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-r28c8"] Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.913464 4886 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.923000 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.923295 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.923728 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.960066 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dba0c99a-0f14-42bd-8822-ee79fc73ee41","Type":"ContainerStarted","Data":"27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2"} Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.967586 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:04:12 crc kubenswrapper[4886]: I0129 17:04:12.990172 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-r28c8"] Jan 29 17:04:12 crc kubenswrapper[4886]: E0129 17:04:12.993714 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-x6r5m ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-r28c8" podUID="60ecf496-dd57-4ed4-9bbc-2e40f9df4447" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.024260 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s7294"] Jan 29 17:04:13 crc kubenswrapper[4886]: E0129 17:04:13.024790 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3748c627-3deb-4b89-acd3-2269f42ba343" containerName="init" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.024811 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3748c627-3deb-4b89-acd3-2269f42ba343" containerName="init" Jan 29 17:04:13 crc kubenswrapper[4886]: E0129 17:04:13.024832 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3748c627-3deb-4b89-acd3-2269f42ba343" containerName="dnsmasq-dns" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.024838 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3748c627-3deb-4b89-acd3-2269f42ba343" containerName="dnsmasq-dns" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.025005 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3748c627-3deb-4b89-acd3-2269f42ba343" containerName="dnsmasq-dns" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.025714 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.051021 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s7294"] Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.061368 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6zcd\" (UniqueName: \"kubernetes.io/projected/3748c627-3deb-4b89-acd3-2269f42ba343-kube-api-access-x6zcd\") pod \"3748c627-3deb-4b89-acd3-2269f42ba343\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.061426 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-dns-svc\") pod \"3748c627-3deb-4b89-acd3-2269f42ba343\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.061750 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-config\") pod \"3748c627-3deb-4b89-acd3-2269f42ba343\" (UID: \"3748c627-3deb-4b89-acd3-2269f42ba343\") " Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.062512 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.062548 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-ring-data-devices\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.062642 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-scripts\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: E0129 17:04:13.062808 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 17:04:13 crc kubenswrapper[4886]: E0129 17:04:13.062829 4886 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 17:04:13 crc kubenswrapper[4886]: E0129 17:04:13.062892 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift podName:6e2f2c6c-bc32-4a32-ba2c-8954d277ce47 nodeName:}" failed. No retries permitted until 2026-01-29 17:04:14.062865877 +0000 UTC m=+2536.971585149 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift") pod "swift-storage-0" (UID: "6e2f2c6c-bc32-4a32-ba2c-8954d277ce47") : configmap "swift-ring-files" not found Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.062943 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-dispersionconf\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.063098 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-swiftconf\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.063176 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-etc-swift\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.063280 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6r5m\" (UniqueName: \"kubernetes.io/projected/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-kube-api-access-x6r5m\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.063318 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-combined-ca-bundle\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.069800 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3748c627-3deb-4b89-acd3-2269f42ba343-kube-api-access-x6zcd" (OuterVolumeSpecName: "kube-api-access-x6zcd") pod "3748c627-3deb-4b89-acd3-2269f42ba343" (UID: "3748c627-3deb-4b89-acd3-2269f42ba343"). InnerVolumeSpecName "kube-api-access-x6zcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.078911 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-r28c8"] Jan 29 17:04:13 crc kubenswrapper[4886]: W0129 17:04:13.100354 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb212bbc_3071_4fda_968d_b6d3f19996ee.slice/crio-da2d61dccf59424cc14b54a614d36ae066f9a9d76b8f120a8702b08ed1b7f949 WatchSource:0}: Error finding container da2d61dccf59424cc14b54a614d36ae066f9a9d76b8f120a8702b08ed1b7f949: Status 404 returned error can't find the container with id da2d61dccf59424cc14b54a614d36ae066f9a9d76b8f120a8702b08ed1b7f949 Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.122304 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-t8rs7"] Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.164905 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3748c627-3deb-4b89-acd3-2269f42ba343" (UID: "3748c627-3deb-4b89-acd3-2269f42ba343"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.168961 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-config" (OuterVolumeSpecName: "config") pod "3748c627-3deb-4b89-acd3-2269f42ba343" (UID: "3748c627-3deb-4b89-acd3-2269f42ba343"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.173554 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-swiftconf\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.173632 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-etc-swift\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.173665 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6r5m\" (UniqueName: \"kubernetes.io/projected/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-kube-api-access-x6r5m\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.173699 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-dispersionconf\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.173717 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-combined-ca-bundle\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.173740 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-scripts\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.173773 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-ring-data-devices\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.173790 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-scripts\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.173897 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km9gr\" (UniqueName: \"kubernetes.io/projected/ebccb3a0-d421-4c30-9201-43e9106e4006-kube-api-access-km9gr\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.173952 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-combined-ca-bundle\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.174013 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-dispersionconf\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.174051 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ebccb3a0-d421-4c30-9201-43e9106e4006-etc-swift\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.174153 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-ring-data-devices\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.174189 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-swiftconf\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.174273 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.174296 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6zcd\" (UniqueName: \"kubernetes.io/projected/3748c627-3deb-4b89-acd3-2269f42ba343-kube-api-access-x6zcd\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.174312 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3748c627-3deb-4b89-acd3-2269f42ba343-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.174709 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-scripts\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.174962 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-etc-swift\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.175120 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-ring-data-devices\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.177707 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-swiftconf\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.178686 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-dispersionconf\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.184990 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-combined-ca-bundle\") pod \"swift-ring-rebalance-r28c8\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.194073 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6r5m\" (UniqueName: \"kubernetes.io/projected/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-kube-api-access-x6r5m\") pod \"swift-ring-rebalance-r28c8\" (UID: 
\"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.275899 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-dispersionconf\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.275953 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-scripts\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.276024 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km9gr\" (UniqueName: \"kubernetes.io/projected/ebccb3a0-d421-4c30-9201-43e9106e4006-kube-api-access-km9gr\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.276050 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-combined-ca-bundle\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.276091 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ebccb3a0-d421-4c30-9201-43e9106e4006-etc-swift\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.276142 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-ring-data-devices\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.276161 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-swiftconf\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.276872 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ebccb3a0-d421-4c30-9201-43e9106e4006-etc-swift\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.277026 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-scripts\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 
17:04:13.277388 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-ring-data-devices\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.279736 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-swiftconf\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.282710 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-dispersionconf\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.285485 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-combined-ca-bundle\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.291653 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km9gr\" (UniqueName: \"kubernetes.io/projected/ebccb3a0-d421-4c30-9201-43e9106e4006-kube-api-access-km9gr\") pod \"swift-ring-rebalance-s7294\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.341471 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.619639 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:04:13 crc kubenswrapper[4886]: E0129 17:04:13.620202 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.815086 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s7294"] Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.978934 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.979039 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tn5pt" event={"ID":"3748c627-3deb-4b89-acd3-2269f42ba343","Type":"ContainerDied","Data":"5ab6a774b30c4926836ad5d20a9d8ca3a61ba5556b7b5bbd72dc9a90a6ac1502"} Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.979644 4886 scope.go:117] "RemoveContainer" containerID="85f248c363891313b6dfd3563ffece575be09f0a7b8fb96dd58a65634816d1bc" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.982638 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s7294" event={"ID":"ebccb3a0-d421-4c30-9201-43e9106e4006","Type":"ContainerStarted","Data":"b1f9445ba0ed2622eaf729acf0f6efe1278fbfe9cc96bab1babb0686d7460824"} Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.989052 4886 generic.go:334] "Generic (PLEG): container finished" podID="eb212bbc-3071-4fda-968d-b6d3f19996ee" containerID="71b921e8db9e8e747c69aeafc44470b62e0400a32e8c7e760d1d991c175cbc64" exitCode=0 Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.989143 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.989414 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-t8rs7" event={"ID":"eb212bbc-3071-4fda-968d-b6d3f19996ee","Type":"ContainerDied","Data":"71b921e8db9e8e747c69aeafc44470b62e0400a32e8c7e760d1d991c175cbc64"} Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.989480 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-t8rs7" event={"ID":"eb212bbc-3071-4fda-968d-b6d3f19996ee","Type":"ContainerStarted","Data":"da2d61dccf59424cc14b54a614d36ae066f9a9d76b8f120a8702b08ed1b7f949"} Jan 29 17:04:13 crc kubenswrapper[4886]: I0129 17:04:13.990547 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.017859 4886 scope.go:117] "RemoveContainer" containerID="fcac16ce7b565761d87666d9cf26f0b7bab43d40d9fedf5938d903160f00e164" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.041056 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=65.8879652 podStartE2EDuration="1m44.041035547s" podCreationTimestamp="2026-01-29 17:02:30 +0000 UTC" firstStartedPulling="2026-01-29 17:03:34.369983163 +0000 UTC m=+2497.278702435" lastFinishedPulling="2026-01-29 17:04:12.52305351 +0000 UTC m=+2535.431772782" observedRunningTime="2026-01-29 17:04:14.037239153 +0000 UTC m=+2536.945958415" watchObservedRunningTime="2026-01-29 17:04:14.041035547 +0000 UTC m=+2536.949754809" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.095173 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:14 crc kubenswrapper[4886]: E0129 17:04:14.095369 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 17:04:14 crc kubenswrapper[4886]: E0129 17:04:14.095387 4886 projected.go:194] Error preparing data for 
projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 17:04:14 crc kubenswrapper[4886]: E0129 17:04:14.095442 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift podName:6e2f2c6c-bc32-4a32-ba2c-8954d277ce47 nodeName:}" failed. No retries permitted until 2026-01-29 17:04:16.095424015 +0000 UTC m=+2539.004143287 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift") pod "swift-storage-0" (UID: "6e2f2c6c-bc32-4a32-ba2c-8954d277ce47") : configmap "swift-ring-files" not found Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.110188 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.178869 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tn5pt"] Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.186891 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tn5pt"] Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.196998 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-combined-ca-bundle\") pod \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.197113 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-swiftconf\") pod \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.197154 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6r5m\" (UniqueName: \"kubernetes.io/projected/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-kube-api-access-x6r5m\") pod \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.197190 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-scripts\") pod \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.197218 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-ring-data-devices\") pod \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.197357 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-dispersionconf\") pod \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.197412 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-etc-swift\") pod \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\" (UID: \"60ecf496-dd57-4ed4-9bbc-2e40f9df4447\") " Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.197739 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-scripts" (OuterVolumeSpecName: "scripts") pod "60ecf496-dd57-4ed4-9bbc-2e40f9df4447" (UID: "60ecf496-dd57-4ed4-9bbc-2e40f9df4447"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.197935 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "60ecf496-dd57-4ed4-9bbc-2e40f9df4447" (UID: "60ecf496-dd57-4ed4-9bbc-2e40f9df4447"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.197944 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "60ecf496-dd57-4ed4-9bbc-2e40f9df4447" (UID: "60ecf496-dd57-4ed4-9bbc-2e40f9df4447"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.198363 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.198380 4886 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.198392 4886 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.202898 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60ecf496-dd57-4ed4-9bbc-2e40f9df4447" (UID: "60ecf496-dd57-4ed4-9bbc-2e40f9df4447"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.203039 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-kube-api-access-x6r5m" (OuterVolumeSpecName: "kube-api-access-x6r5m") pod "60ecf496-dd57-4ed4-9bbc-2e40f9df4447" (UID: "60ecf496-dd57-4ed4-9bbc-2e40f9df4447"). InnerVolumeSpecName "kube-api-access-x6r5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.203017 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "60ecf496-dd57-4ed4-9bbc-2e40f9df4447" (UID: "60ecf496-dd57-4ed4-9bbc-2e40f9df4447"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.203156 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "60ecf496-dd57-4ed4-9bbc-2e40f9df4447" (UID: "60ecf496-dd57-4ed4-9bbc-2e40f9df4447"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.300659 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.300695 4886 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.300709 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6r5m\" (UniqueName: \"kubernetes.io/projected/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-kube-api-access-x6r5m\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.300722 4886 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60ecf496-dd57-4ed4-9bbc-2e40f9df4447-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:14 crc kubenswrapper[4886]: I0129 17:04:14.634492 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3748c627-3deb-4b89-acd3-2269f42ba343" path="/var/lib/kubelet/pods/3748c627-3deb-4b89-acd3-2269f42ba343/volumes" Jan 29 17:04:15 crc kubenswrapper[4886]: I0129 17:04:15.000477 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-r28c8" Jan 29 17:04:15 crc kubenswrapper[4886]: I0129 17:04:15.047166 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-r28c8"] Jan 29 17:04:15 crc kubenswrapper[4886]: I0129 17:04:15.077477 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-r28c8"] Jan 29 17:04:16 crc kubenswrapper[4886]: I0129 17:04:16.143556 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:16 crc kubenswrapper[4886]: E0129 17:04:16.143749 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 17:04:16 crc kubenswrapper[4886]: E0129 17:04:16.143998 4886 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 17:04:16 crc kubenswrapper[4886]: E0129 17:04:16.144050 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift podName:6e2f2c6c-bc32-4a32-ba2c-8954d277ce47 nodeName:}" failed. No retries permitted until 2026-01-29 17:04:20.144032799 +0000 UTC m=+2543.052752071 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift") pod "swift-storage-0" (UID: "6e2f2c6c-bc32-4a32-ba2c-8954d277ce47") : configmap "swift-ring-files" not found Jan 29 17:04:16 crc kubenswrapper[4886]: I0129 17:04:16.638816 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ecf496-dd57-4ed4-9bbc-2e40f9df4447" path="/var/lib/kubelet/pods/60ecf496-dd57-4ed4-9bbc-2e40f9df4447/volumes" Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.017805 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-t8rs7" event={"ID":"eb212bbc-3071-4fda-968d-b6d3f19996ee","Type":"ContainerStarted","Data":"54bdeb43a338f0b719b206ca212f50bc02c6d2592ec0ac66c6b8743631a3cf1b"} Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.043790 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-v692m"] Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.045027 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-v692m" Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.048051 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.060317 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-v692m"] Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.168080 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-operator-scripts\") pod \"root-account-create-update-v692m\" (UID: \"7a29ba47-9a94-492f-8abd-c01b04d0b3c1\") " pod="openstack/root-account-create-update-v692m" Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.168148 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjrkn\" (UniqueName: \"kubernetes.io/projected/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-kube-api-access-gjrkn\") pod \"root-account-create-update-v692m\" (UID: \"7a29ba47-9a94-492f-8abd-c01b04d0b3c1\") " pod="openstack/root-account-create-update-v692m" Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.270517 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-operator-scripts\") pod \"root-account-create-update-v692m\" (UID: \"7a29ba47-9a94-492f-8abd-c01b04d0b3c1\") " pod="openstack/root-account-create-update-v692m" Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.270577 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjrkn\" (UniqueName: \"kubernetes.io/projected/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-kube-api-access-gjrkn\") pod \"root-account-create-update-v692m\" (UID: \"7a29ba47-9a94-492f-8abd-c01b04d0b3c1\") " pod="openstack/root-account-create-update-v692m" Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.271710 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-operator-scripts\") pod \"root-account-create-update-v692m\" (UID: \"7a29ba47-9a94-492f-8abd-c01b04d0b3c1\") " 
pod="openstack/root-account-create-update-v692m" Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.292098 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjrkn\" (UniqueName: \"kubernetes.io/projected/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-kube-api-access-gjrkn\") pod \"root-account-create-update-v692m\" (UID: \"7a29ba47-9a94-492f-8abd-c01b04d0b3c1\") " pod="openstack/root-account-create-update-v692m" Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.363222 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-v692m" Jan 29 17:04:17 crc kubenswrapper[4886]: I0129 17:04:17.863835 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-v692m"] Jan 29 17:04:17 crc kubenswrapper[4886]: W0129 17:04:17.865113 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a29ba47_9a94_492f_8abd_c01b04d0b3c1.slice/crio-64c66fbc90bf20316435457059ddb5ea811599c8b622e4c863e62edddb2ed230 WatchSource:0}: Error finding container 64c66fbc90bf20316435457059ddb5ea811599c8b622e4c863e62edddb2ed230: Status 404 returned error can't find the container with id 64c66fbc90bf20316435457059ddb5ea811599c8b622e4c863e62edddb2ed230 Jan 29 17:04:18 crc kubenswrapper[4886]: I0129 17:04:18.030969 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v692m" event={"ID":"7a29ba47-9a94-492f-8abd-c01b04d0b3c1","Type":"ContainerStarted","Data":"64c66fbc90bf20316435457059ddb5ea811599c8b622e4c863e62edddb2ed230"} Jan 29 17:04:18 crc kubenswrapper[4886]: I0129 17:04:18.031116 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:18 crc kubenswrapper[4886]: I0129 17:04:18.067078 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-t8rs7" podStartSLOduration=7.067056323 podStartE2EDuration="7.067056323s" podCreationTimestamp="2026-01-29 17:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:18.061213162 +0000 UTC m=+2540.969932434" watchObservedRunningTime="2026-01-29 17:04:18.067056323 +0000 UTC m=+2540.975775595" Jan 29 17:04:20 crc kubenswrapper[4886]: I0129 17:04:20.148940 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:20 crc kubenswrapper[4886]: E0129 17:04:20.149187 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 17:04:20 crc kubenswrapper[4886]: E0129 17:04:20.149633 4886 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 17:04:20 crc kubenswrapper[4886]: E0129 17:04:20.149694 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift podName:6e2f2c6c-bc32-4a32-ba2c-8954d277ce47 nodeName:}" failed. No retries permitted until 2026-01-29 17:04:28.149674113 +0000 UTC m=+2551.058393385 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift") pod "swift-storage-0" (UID: "6e2f2c6c-bc32-4a32-ba2c-8954d277ce47") : configmap "swift-ring-files" not found Jan 29 17:04:20 crc kubenswrapper[4886]: I0129 17:04:20.656439 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 29 17:04:21 crc kubenswrapper[4886]: I0129 17:04:21.057045 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v692m" event={"ID":"7a29ba47-9a94-492f-8abd-c01b04d0b3c1","Type":"ContainerStarted","Data":"8d073617833fd03b3552145f85acbb902d34a0687d97b69de74b719dca519779"} Jan 29 17:04:21 crc kubenswrapper[4886]: I0129 17:04:21.078595 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-v692m" podStartSLOduration=4.078577416 podStartE2EDuration="4.078577416s" podCreationTimestamp="2026-01-29 17:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:21.076733716 +0000 UTC m=+2543.985452998" watchObservedRunningTime="2026-01-29 17:04:21.078577416 +0000 UTC m=+2543.987296688" Jan 29 17:04:21 crc kubenswrapper[4886]: I0129 17:04:21.107464 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 29 17:04:21 crc kubenswrapper[4886]: I0129 17:04:21.507976 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:04:21 crc kubenswrapper[4886]: I0129 17:04:21.598608 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-29gw9"] Jan 29 17:04:21 crc kubenswrapper[4886]: I0129 17:04:21.598853 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" podUID="4ef7b166-c078-4530-b05b-ae3e44088122" containerName="dnsmasq-dns" containerID="cri-o://e0d2fbb581e1f1576641f1d25760b3a9a9b2fc1c9e7db710f6875c72957b1c0b" gracePeriod=10 Jan 29 17:04:23 crc kubenswrapper[4886]: I0129 17:04:23.076838 4886 generic.go:334] "Generic (PLEG): container finished" podID="4ef7b166-c078-4530-b05b-ae3e44088122" containerID="e0d2fbb581e1f1576641f1d25760b3a9a9b2fc1c9e7db710f6875c72957b1c0b" exitCode=0 Jan 29 17:04:23 crc kubenswrapper[4886]: I0129 17:04:23.076925 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" event={"ID":"4ef7b166-c078-4530-b05b-ae3e44088122","Type":"ContainerDied","Data":"e0d2fbb581e1f1576641f1d25760b3a9a9b2fc1c9e7db710f6875c72957b1c0b"} Jan 29 17:04:23 crc kubenswrapper[4886]: I0129 17:04:23.822171 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 29 17:04:23 crc kubenswrapper[4886]: I0129 17:04:23.916274 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.043405 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-b7d9p" podUID="544b4515-481c-47f1-acb6-ed332a3497d4" containerName="ovn-controller" probeResult="failure" output=< Jan 29 17:04:24 crc kubenswrapper[4886]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 29 17:04:24 crc kubenswrapper[4886]: > Jan 29 
17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.092043 4886 generic.go:334] "Generic (PLEG): container finished" podID="9d0db9ae-746b-419a-bc61-bf85645d2bff" containerID="90c62e1af999c12bd3cee48206c3c037d5e41331e61dd2c2d6e99f50a71acbba" exitCode=0 Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.092102 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9d0db9ae-746b-419a-bc61-bf85645d2bff","Type":"ContainerDied","Data":"90c62e1af999c12bd3cee48206c3c037d5e41331e61dd2c2d6e99f50a71acbba"} Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.096019 4886 generic.go:334] "Generic (PLEG): container finished" podID="842bfe4d-04ba-4143-9076-3033163c7b82" containerID="5c98fb62cf57fb19a685fed0c362721e82c04b5d528f5ad7579c1412f1f79e81" exitCode=0 Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.096089 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"842bfe4d-04ba-4143-9076-3033163c7b82","Type":"ContainerDied","Data":"5c98fb62cf57fb19a685fed0c362721e82c04b5d528f5ad7579c1412f1f79e81"} Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.108587 4886 generic.go:334] "Generic (PLEG): container finished" podID="49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10" containerID="e164b2712bb12971248661528d0d661417a2f6869697cd179a3843bd4e2721f1" exitCode=0 Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.108661 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10","Type":"ContainerDied","Data":"e164b2712bb12971248661528d0d661417a2f6869697cd179a3843bd4e2721f1"} Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.114197 4886 generic.go:334] "Generic (PLEG): container finished" podID="2b0be43b-8956-45aa-ad50-de9183b3fea3" containerID="121b418980e461ff82cc0059422b3aec6e494e5fd4c123ffbab962202999757c" exitCode=0 Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.115142 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b0be43b-8956-45aa-ad50-de9183b3fea3","Type":"ContainerDied","Data":"121b418980e461ff82cc0059422b3aec6e494e5fd4c123ffbab962202999757c"} Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.161463 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f0b5-account-create-update-8b8vz"] Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.163644 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f0b5-account-create-update-8b8vz" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.165705 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.171942 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.180479 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f0b5-account-create-update-8b8vz"] Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.245980 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29921ec8-f68f-4547-a2c0-d4d3f5de6960-operator-scripts\") pod \"glance-f0b5-account-create-update-8b8vz\" (UID: \"29921ec8-f68f-4547-a2c0-d4d3f5de6960\") " pod="openstack/glance-f0b5-account-create-update-8b8vz" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.246355 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxbwc\" (UniqueName: \"kubernetes.io/projected/29921ec8-f68f-4547-a2c0-d4d3f5de6960-kube-api-access-pxbwc\") pod \"glance-f0b5-account-create-update-8b8vz\" (UID: \"29921ec8-f68f-4547-a2c0-d4d3f5de6960\") " pod="openstack/glance-f0b5-account-create-update-8b8vz" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.352530 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29921ec8-f68f-4547-a2c0-d4d3f5de6960-operator-scripts\") pod \"glance-f0b5-account-create-update-8b8vz\" (UID: \"29921ec8-f68f-4547-a2c0-d4d3f5de6960\") " pod="openstack/glance-f0b5-account-create-update-8b8vz" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.352868 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxbwc\" (UniqueName: \"kubernetes.io/projected/29921ec8-f68f-4547-a2c0-d4d3f5de6960-kube-api-access-pxbwc\") pod \"glance-f0b5-account-create-update-8b8vz\" (UID: \"29921ec8-f68f-4547-a2c0-d4d3f5de6960\") " pod="openstack/glance-f0b5-account-create-update-8b8vz" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.353440 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29921ec8-f68f-4547-a2c0-d4d3f5de6960-operator-scripts\") pod \"glance-f0b5-account-create-update-8b8vz\" (UID: \"29921ec8-f68f-4547-a2c0-d4d3f5de6960\") " pod="openstack/glance-f0b5-account-create-update-8b8vz" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.408014 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxbwc\" (UniqueName: \"kubernetes.io/projected/29921ec8-f68f-4547-a2c0-d4d3f5de6960-kube-api-access-pxbwc\") pod \"glance-f0b5-account-create-update-8b8vz\" (UID: \"29921ec8-f68f-4547-a2c0-d4d3f5de6960\") " pod="openstack/glance-f0b5-account-create-update-8b8vz" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.503828 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f0b5-account-create-update-8b8vz" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.513484 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-mdvpb"] Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.515534 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mdvpb" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.538550 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mdvpb"] Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.560083 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-operator-scripts\") pod \"glance-db-create-mdvpb\" (UID: \"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe\") " pod="openstack/glance-db-create-mdvpb" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.560118 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2mjv\" (UniqueName: \"kubernetes.io/projected/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-kube-api-access-s2mjv\") pod \"glance-db-create-mdvpb\" (UID: \"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe\") " pod="openstack/glance-db-create-mdvpb" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.661658 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-operator-scripts\") pod \"glance-db-create-mdvpb\" (UID: \"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe\") " pod="openstack/glance-db-create-mdvpb" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.661712 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2mjv\" (UniqueName: \"kubernetes.io/projected/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-kube-api-access-s2mjv\") pod \"glance-db-create-mdvpb\" (UID: \"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe\") " pod="openstack/glance-db-create-mdvpb" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.662566 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-operator-scripts\") pod \"glance-db-create-mdvpb\" (UID: \"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe\") " pod="openstack/glance-db-create-mdvpb" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.678483 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2mjv\" (UniqueName: \"kubernetes.io/projected/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-kube-api-access-s2mjv\") pod \"glance-db-create-mdvpb\" (UID: \"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe\") " pod="openstack/glance-db-create-mdvpb" Jan 29 17:04:24 crc kubenswrapper[4886]: I0129 17:04:24.848821 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mdvpb" Jan 29 17:04:25 crc kubenswrapper[4886]: I0129 17:04:25.135887 4886 generic.go:334] "Generic (PLEG): container finished" podID="7a29ba47-9a94-492f-8abd-c01b04d0b3c1" containerID="8d073617833fd03b3552145f85acbb902d34a0687d97b69de74b719dca519779" exitCode=0 Jan 29 17:04:25 crc kubenswrapper[4886]: I0129 17:04:25.135934 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v692m" event={"ID":"7a29ba47-9a94-492f-8abd-c01b04d0b3c1","Type":"ContainerDied","Data":"8d073617833fd03b3552145f85acbb902d34a0687d97b69de74b719dca519779"} Jan 29 17:04:26 crc kubenswrapper[4886]: E0129 17:04:26.398178 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741" Jan 29 17:04:26 crc kubenswrapper[4886]: E0129 17:04:26.398991 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741,Command:[],Args:[--config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus --web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-1,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-2,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w2cnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 
web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(ce7955a1-eb58-425a-872a-7ec102b8e090): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.531878 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.601063 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-nb\") pod \"4ef7b166-c078-4530-b05b-ae3e44088122\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.601137 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-sb\") pod \"4ef7b166-c078-4530-b05b-ae3e44088122\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.601188 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-dns-svc\") pod \"4ef7b166-c078-4530-b05b-ae3e44088122\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.601244 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-config\") pod \"4ef7b166-c078-4530-b05b-ae3e44088122\" (UID: \"4ef7b166-c078-4530-b05b-ae3e44088122\") " Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.601359 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5gl6\" (UniqueName: \"kubernetes.io/projected/4ef7b166-c078-4530-b05b-ae3e44088122-kube-api-access-h5gl6\") pod \"4ef7b166-c078-4530-b05b-ae3e44088122\" (UID: 
\"4ef7b166-c078-4530-b05b-ae3e44088122\") " Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.607186 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef7b166-c078-4530-b05b-ae3e44088122-kube-api-access-h5gl6" (OuterVolumeSpecName: "kube-api-access-h5gl6") pod "4ef7b166-c078-4530-b05b-ae3e44088122" (UID: "4ef7b166-c078-4530-b05b-ae3e44088122"). InnerVolumeSpecName "kube-api-access-h5gl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.654268 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ef7b166-c078-4530-b05b-ae3e44088122" (UID: "4ef7b166-c078-4530-b05b-ae3e44088122"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.662284 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-config" (OuterVolumeSpecName: "config") pod "4ef7b166-c078-4530-b05b-ae3e44088122" (UID: "4ef7b166-c078-4530-b05b-ae3e44088122"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.677263 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ef7b166-c078-4530-b05b-ae3e44088122" (UID: "4ef7b166-c078-4530-b05b-ae3e44088122"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.697614 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ef7b166-c078-4530-b05b-ae3e44088122" (UID: "4ef7b166-c078-4530-b05b-ae3e44088122"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.703682 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.703718 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.703734 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.703747 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef7b166-c078-4530-b05b-ae3e44088122-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:26 crc kubenswrapper[4886]: I0129 17:04:26.703759 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5gl6\" (UniqueName: \"kubernetes.io/projected/4ef7b166-c078-4530-b05b-ae3e44088122-kube-api-access-h5gl6\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.154574 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" event={"ID":"4ef7b166-c078-4530-b05b-ae3e44088122","Type":"ContainerDied","Data":"b30007dc7ac0cb559fa26a9b1b3904c3d91b03c66e5d4e617cb72bf920854daa"} Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.154998 4886 scope.go:117] "RemoveContainer" containerID="e0d2fbb581e1f1576641f1d25760b3a9a9b2fc1c9e7db710f6875c72957b1c0b" Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.154667 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.189272 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-29gw9"] Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.197576 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-29gw9"] Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.644744 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-v692m" Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.723855 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-operator-scripts\") pod \"7a29ba47-9a94-492f-8abd-c01b04d0b3c1\" (UID: \"7a29ba47-9a94-492f-8abd-c01b04d0b3c1\") " Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.724016 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjrkn\" (UniqueName: \"kubernetes.io/projected/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-kube-api-access-gjrkn\") pod \"7a29ba47-9a94-492f-8abd-c01b04d0b3c1\" (UID: \"7a29ba47-9a94-492f-8abd-c01b04d0b3c1\") " Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.725083 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a29ba47-9a94-492f-8abd-c01b04d0b3c1" (UID: "7a29ba47-9a94-492f-8abd-c01b04d0b3c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.725716 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.744674 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-kube-api-access-gjrkn" (OuterVolumeSpecName: "kube-api-access-gjrkn") pod "7a29ba47-9a94-492f-8abd-c01b04d0b3c1" (UID: "7a29ba47-9a94-492f-8abd-c01b04d0b3c1"). InnerVolumeSpecName "kube-api-access-gjrkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:27 crc kubenswrapper[4886]: I0129 17:04:27.828759 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjrkn\" (UniqueName: \"kubernetes.io/projected/7a29ba47-9a94-492f-8abd-c01b04d0b3c1-kube-api-access-gjrkn\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.054225 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-sgspp"] Jan 29 17:04:28 crc kubenswrapper[4886]: E0129 17:04:28.054865 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef7b166-c078-4530-b05b-ae3e44088122" containerName="init" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.054883 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef7b166-c078-4530-b05b-ae3e44088122" containerName="init" Jan 29 17:04:28 crc kubenswrapper[4886]: E0129 17:04:28.054896 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a29ba47-9a94-492f-8abd-c01b04d0b3c1" containerName="mariadb-account-create-update" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.054904 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a29ba47-9a94-492f-8abd-c01b04d0b3c1" containerName="mariadb-account-create-update" Jan 29 17:04:28 crc kubenswrapper[4886]: E0129 17:04:28.054922 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef7b166-c078-4530-b05b-ae3e44088122" containerName="dnsmasq-dns" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.054930 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef7b166-c078-4530-b05b-ae3e44088122" containerName="dnsmasq-dns" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.055192 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a29ba47-9a94-492f-8abd-c01b04d0b3c1" containerName="mariadb-account-create-update" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.055213 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef7b166-c078-4530-b05b-ae3e44088122" containerName="dnsmasq-dns" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.056193 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-sgspp" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.063074 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sgspp"] Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.140607 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hc79\" (UniqueName: \"kubernetes.io/projected/b696cd6b-840b-4505-9010-114d223a90e9-kube-api-access-8hc79\") pod \"keystone-db-create-sgspp\" (UID: \"b696cd6b-840b-4505-9010-114d223a90e9\") " pod="openstack/keystone-db-create-sgspp" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.140898 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b696cd6b-840b-4505-9010-114d223a90e9-operator-scripts\") pod \"keystone-db-create-sgspp\" (UID: \"b696cd6b-840b-4505-9010-114d223a90e9\") " pod="openstack/keystone-db-create-sgspp" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.170385 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v692m" event={"ID":"7a29ba47-9a94-492f-8abd-c01b04d0b3c1","Type":"ContainerDied","Data":"64c66fbc90bf20316435457059ddb5ea811599c8b622e4c863e62edddb2ed230"} Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.170455 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64c66fbc90bf20316435457059ddb5ea811599c8b622e4c863e62edddb2ed230" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.170553 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-v692m" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.181553 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-00e3-account-create-update-5hhsj"] Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.183180 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-00e3-account-create-update-5hhsj" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.185878 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.205987 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-00e3-account-create-update-5hhsj"] Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.242870 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b696cd6b-840b-4505-9010-114d223a90e9-operator-scripts\") pod \"keystone-db-create-sgspp\" (UID: \"b696cd6b-840b-4505-9010-114d223a90e9\") " pod="openstack/keystone-db-create-sgspp" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.242951 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hc79\" (UniqueName: \"kubernetes.io/projected/b696cd6b-840b-4505-9010-114d223a90e9-kube-api-access-8hc79\") pod \"keystone-db-create-sgspp\" (UID: \"b696cd6b-840b-4505-9010-114d223a90e9\") " pod="openstack/keystone-db-create-sgspp" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.243010 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m47b2\" (UniqueName: \"kubernetes.io/projected/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-kube-api-access-m47b2\") pod \"keystone-00e3-account-create-update-5hhsj\" (UID: \"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a\") " pod="openstack/keystone-00e3-account-create-update-5hhsj" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.243069 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.243120 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-operator-scripts\") pod \"keystone-00e3-account-create-update-5hhsj\" (UID: \"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a\") " pod="openstack/keystone-00e3-account-create-update-5hhsj" Jan 29 17:04:28 crc kubenswrapper[4886]: E0129 17:04:28.243504 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 17:04:28 crc kubenswrapper[4886]: E0129 17:04:28.243541 4886 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 17:04:28 crc kubenswrapper[4886]: E0129 17:04:28.243599 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift podName:6e2f2c6c-bc32-4a32-ba2c-8954d277ce47 nodeName:}" failed. No retries permitted until 2026-01-29 17:04:44.243578396 +0000 UTC m=+2567.152297688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift") pod "swift-storage-0" (UID: "6e2f2c6c-bc32-4a32-ba2c-8954d277ce47") : configmap "swift-ring-files" not found Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.244453 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b696cd6b-840b-4505-9010-114d223a90e9-operator-scripts\") pod \"keystone-db-create-sgspp\" (UID: \"b696cd6b-840b-4505-9010-114d223a90e9\") " pod="openstack/keystone-db-create-sgspp" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.289382 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hc79\" (UniqueName: \"kubernetes.io/projected/b696cd6b-840b-4505-9010-114d223a90e9-kube-api-access-8hc79\") pod \"keystone-db-create-sgspp\" (UID: \"b696cd6b-840b-4505-9010-114d223a90e9\") " pod="openstack/keystone-db-create-sgspp" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.345437 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m47b2\" (UniqueName: \"kubernetes.io/projected/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-kube-api-access-m47b2\") pod \"keystone-00e3-account-create-update-5hhsj\" (UID: \"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a\") " pod="openstack/keystone-00e3-account-create-update-5hhsj" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.345535 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-operator-scripts\") pod \"keystone-00e3-account-create-update-5hhsj\" (UID: \"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a\") " pod="openstack/keystone-00e3-account-create-update-5hhsj" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.346236 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-operator-scripts\") pod \"keystone-00e3-account-create-update-5hhsj\" (UID: \"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a\") " pod="openstack/keystone-00e3-account-create-update-5hhsj" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.386768 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-4vq4n"] Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.388434 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4vq4n" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.404983 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m47b2\" (UniqueName: \"kubernetes.io/projected/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-kube-api-access-m47b2\") pod \"keystone-00e3-account-create-update-5hhsj\" (UID: \"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a\") " pod="openstack/keystone-00e3-account-create-update-5hhsj" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.410663 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4vq4n"] Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.445195 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-sgspp" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.446596 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-operator-scripts\") pod \"placement-db-create-4vq4n\" (UID: \"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d\") " pod="openstack/placement-db-create-4vq4n" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.446790 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8547\" (UniqueName: \"kubernetes.io/projected/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-kube-api-access-n8547\") pod \"placement-db-create-4vq4n\" (UID: \"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d\") " pod="openstack/placement-db-create-4vq4n" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.505501 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d860-account-create-update-5kd66"] Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.506881 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d860-account-create-update-5kd66" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.508190 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-00e3-account-create-update-5hhsj" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.509684 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.514590 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d860-account-create-update-5kd66"] Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.548756 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzhjq\" (UniqueName: \"kubernetes.io/projected/66c16915-30cc-4a4f-81ff-4b82cf152968-kube-api-access-lzhjq\") pod \"placement-d860-account-create-update-5kd66\" (UID: \"66c16915-30cc-4a4f-81ff-4b82cf152968\") " pod="openstack/placement-d860-account-create-update-5kd66" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.548862 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8547\" (UniqueName: \"kubernetes.io/projected/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-kube-api-access-n8547\") pod \"placement-db-create-4vq4n\" (UID: \"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d\") " pod="openstack/placement-db-create-4vq4n" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.549094 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-operator-scripts\") pod \"placement-db-create-4vq4n\" (UID: \"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d\") " pod="openstack/placement-db-create-4vq4n" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.549133 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66c16915-30cc-4a4f-81ff-4b82cf152968-operator-scripts\") pod \"placement-d860-account-create-update-5kd66\" (UID: \"66c16915-30cc-4a4f-81ff-4b82cf152968\") " pod="openstack/placement-d860-account-create-update-5kd66" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.549926 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-operator-scripts\") pod \"placement-db-create-4vq4n\" (UID: \"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d\") " pod="openstack/placement-db-create-4vq4n" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.572982 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8547\" (UniqueName: \"kubernetes.io/projected/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-kube-api-access-n8547\") pod \"placement-db-create-4vq4n\" (UID: \"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d\") " pod="openstack/placement-db-create-4vq4n" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.621128 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:04:28 crc kubenswrapper[4886]: E0129 17:04:28.621474 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.627960 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef7b166-c078-4530-b05b-ae3e44088122" path="/var/lib/kubelet/pods/4ef7b166-c078-4530-b05b-ae3e44088122/volumes" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.651125 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzhjq\" (UniqueName: \"kubernetes.io/projected/66c16915-30cc-4a4f-81ff-4b82cf152968-kube-api-access-lzhjq\") pod \"placement-d860-account-create-update-5kd66\" (UID: \"66c16915-30cc-4a4f-81ff-4b82cf152968\") " pod="openstack/placement-d860-account-create-update-5kd66" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.651446 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66c16915-30cc-4a4f-81ff-4b82cf152968-operator-scripts\") pod \"placement-d860-account-create-update-5kd66\" (UID: \"66c16915-30cc-4a4f-81ff-4b82cf152968\") " pod="openstack/placement-d860-account-create-update-5kd66" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.652776 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66c16915-30cc-4a4f-81ff-4b82cf152968-operator-scripts\") pod \"placement-d860-account-create-update-5kd66\" (UID: \"66c16915-30cc-4a4f-81ff-4b82cf152968\") " pod="openstack/placement-d860-account-create-update-5kd66" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.669053 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzhjq\" (UniqueName: \"kubernetes.io/projected/66c16915-30cc-4a4f-81ff-4b82cf152968-kube-api-access-lzhjq\") pod \"placement-d860-account-create-update-5kd66\" (UID: \"66c16915-30cc-4a4f-81ff-4b82cf152968\") " pod="openstack/placement-d860-account-create-update-5kd66" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.753246 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4vq4n" Jan 29 17:04:28 crc kubenswrapper[4886]: I0129 17:04:28.832677 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d860-account-create-update-5kd66" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.045106 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-b7d9p" podUID="544b4515-481c-47f1-acb6-ed332a3497d4" containerName="ovn-controller" probeResult="failure" output=< Jan 29 17:04:29 crc kubenswrapper[4886]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 29 17:04:29 crc kubenswrapper[4886]: > Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.150116 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-xhds2" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.385129 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-b7d9p-config-fbd7w"] Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.387230 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.391217 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.402781 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b7d9p-config-fbd7w"] Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.466694 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-additional-scripts\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.466741 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run-ovn\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.466785 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-log-ovn\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.466902 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7nwl\" (UniqueName: \"kubernetes.io/projected/e489f203-c94a-4bbb-b22a-750bec963d77-kube-api-access-j7nwl\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.466969 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-scripts\") pod 
\"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.467098 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.577383 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-log-ovn\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.577480 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7nwl\" (UniqueName: \"kubernetes.io/projected/e489f203-c94a-4bbb-b22a-750bec963d77-kube-api-access-j7nwl\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.577981 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-log-ovn\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.578597 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-scripts\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.579168 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.579352 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-additional-scripts\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.579391 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run-ovn\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.579346 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run\") pod 
\"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.579739 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run-ovn\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.580211 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-additional-scripts\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.582179 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-scripts\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.602670 4886 scope.go:117] "RemoveContainer" containerID="cbbe07486135ddfe120920c1f4f9ccadece896cbebac702a4fee9f0d2022f4db" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.609228 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7nwl\" (UniqueName: \"kubernetes.io/projected/e489f203-c94a-4bbb-b22a-750bec963d77-kube-api-access-j7nwl\") pod \"ovn-controller-b7d9p-config-fbd7w\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:29 crc kubenswrapper[4886]: I0129 17:04:29.718112 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.278688 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f0b5-account-create-update-8b8vz"] Jan 29 17:04:30 crc kubenswrapper[4886]: W0129 17:04:30.282199 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29921ec8_f68f_4547_a2c0_d4d3f5de6960.slice/crio-e3585e24c6e310ab66cc3acdb8b7196a729aef835b23a64db0aa1d39659b162c WatchSource:0}: Error finding container e3585e24c6e310ab66cc3acdb8b7196a729aef835b23a64db0aa1d39659b162c: Status 404 returned error can't find the container with id e3585e24c6e310ab66cc3acdb8b7196a729aef835b23a64db0aa1d39659b162c Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.375809 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-v692m"] Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.396714 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-v692m"] Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.524006 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d860-account-create-update-5kd66"] Jan 29 17:04:30 crc kubenswrapper[4886]: W0129 17:04:30.528751 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c4e1c71_a857_4feb_8778_ba3aa8b7dbfe.slice/crio-b50b1c67e2972d88bd8981e1a3db87ee14511c02cd94a92c47a372ec32761177 WatchSource:0}: Error finding container b50b1c67e2972d88bd8981e1a3db87ee14511c02cd94a92c47a372ec32761177: Status 404 returned error can't find the container with id b50b1c67e2972d88bd8981e1a3db87ee14511c02cd94a92c47a372ec32761177 Jan 29 17:04:30 crc kubenswrapper[4886]: W0129 17:04:30.534098 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa302a57_5c6b_41b1_ac4b_7d9095b7b65a.slice/crio-75581e1d16d26560497cc9988813329216f56a92bcacbc7cddb3b31eef34be95 WatchSource:0}: Error finding container 75581e1d16d26560497cc9988813329216f56a92bcacbc7cddb3b31eef34be95: Status 404 returned error can't find the container with id 75581e1d16d26560497cc9988813329216f56a92bcacbc7cddb3b31eef34be95 Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.534710 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mdvpb"] Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.542062 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-00e3-account-create-update-5hhsj"] Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.643244 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a29ba47-9a94-492f-8abd-c01b04d0b3c1" path="/var/lib/kubelet/pods/7a29ba47-9a94-492f-8abd-c01b04d0b3c1/volumes" Jan 29 17:04:30 crc kubenswrapper[4886]: W0129 17:04:30.722200 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bcdded9_ad2a_4fcc_82f1_0a13cf85b06d.slice/crio-01b4206a66380781bc1d5bf890de4dd2a4c91be01985eaaaf4ae95a14ceba772 WatchSource:0}: Error finding container 01b4206a66380781bc1d5bf890de4dd2a4c91be01985eaaaf4ae95a14ceba772: Status 404 returned error can't find the container with id 01b4206a66380781bc1d5bf890de4dd2a4c91be01985eaaaf4ae95a14ceba772 Jan 29 17:04:30 crc kubenswrapper[4886]: 
W0129 17:04:30.726807 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode489f203_c94a_4bbb_b22a_750bec963d77.slice/crio-3494f9c79f1c1ef413b78a2d49593156e0435e82f4c6ab83f28f950673f2985c WatchSource:0}: Error finding container 3494f9c79f1c1ef413b78a2d49593156e0435e82f4c6ab83f28f950673f2985c: Status 404 returned error can't find the container with id 3494f9c79f1c1ef413b78a2d49593156e0435e82f4c6ab83f28f950673f2985c Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.726875 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b7d9p-config-fbd7w"] Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.739782 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4vq4n"] Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.749822 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sgspp"] Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.848547 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-fw887"] Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.849835 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-fw887" Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.871184 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-fw887"] Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.914711 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4xhg\" (UniqueName: \"kubernetes.io/projected/6479af73-81ef-4755-89b5-3a2dd44e99b3-kube-api-access-m4xhg\") pod \"mysqld-exporter-openstack-db-create-fw887\" (UID: \"6479af73-81ef-4755-89b5-3a2dd44e99b3\") " pod="openstack/mysqld-exporter-openstack-db-create-fw887" Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.914843 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6479af73-81ef-4755-89b5-3a2dd44e99b3-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-fw887\" (UID: \"6479af73-81ef-4755-89b5-3a2dd44e99b3\") " pod="openstack/mysqld-exporter-openstack-db-create-fw887" Jan 29 17:04:30 crc kubenswrapper[4886]: I0129 17:04:30.917796 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-29gw9" podUID="4ef7b166-c078-4530-b05b-ae3e44088122" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.162:5353: i/o timeout" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.015788 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4xhg\" (UniqueName: \"kubernetes.io/projected/6479af73-81ef-4755-89b5-3a2dd44e99b3-kube-api-access-m4xhg\") pod \"mysqld-exporter-openstack-db-create-fw887\" (UID: \"6479af73-81ef-4755-89b5-3a2dd44e99b3\") " pod="openstack/mysqld-exporter-openstack-db-create-fw887" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.015906 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6479af73-81ef-4755-89b5-3a2dd44e99b3-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-fw887\" (UID: \"6479af73-81ef-4755-89b5-3a2dd44e99b3\") " 
pod="openstack/mysqld-exporter-openstack-db-create-fw887" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.016616 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6479af73-81ef-4755-89b5-3a2dd44e99b3-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-fw887\" (UID: \"6479af73-81ef-4755-89b5-3a2dd44e99b3\") " pod="openstack/mysqld-exporter-openstack-db-create-fw887" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.038987 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4xhg\" (UniqueName: \"kubernetes.io/projected/6479af73-81ef-4755-89b5-3a2dd44e99b3-kube-api-access-m4xhg\") pod \"mysqld-exporter-openstack-db-create-fw887\" (UID: \"6479af73-81ef-4755-89b5-3a2dd44e99b3\") " pod="openstack/mysqld-exporter-openstack-db-create-fw887" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.208287 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d860-account-create-update-5kd66" event={"ID":"66c16915-30cc-4a4f-81ff-4b82cf152968","Type":"ContainerStarted","Data":"4b1a89009d472fe5b2dceb7b8a0b8294983468e34c2707bffbc7bce6c3368172"} Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.209493 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-00e3-account-create-update-5hhsj" event={"ID":"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a","Type":"ContainerStarted","Data":"75581e1d16d26560497cc9988813329216f56a92bcacbc7cddb3b31eef34be95"} Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.210874 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f0b5-account-create-update-8b8vz" event={"ID":"29921ec8-f68f-4547-a2c0-d4d3f5de6960","Type":"ContainerStarted","Data":"e3585e24c6e310ab66cc3acdb8b7196a729aef835b23a64db0aa1d39659b162c"} Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.231940 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4vq4n" event={"ID":"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d","Type":"ContainerStarted","Data":"01b4206a66380781bc1d5bf890de4dd2a4c91be01985eaaaf4ae95a14ceba772"} Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.233722 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b7d9p-config-fbd7w" event={"ID":"e489f203-c94a-4bbb-b22a-750bec963d77","Type":"ContainerStarted","Data":"3494f9c79f1c1ef413b78a2d49593156e0435e82f4c6ab83f28f950673f2985c"} Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.234815 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mdvpb" event={"ID":"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe","Type":"ContainerStarted","Data":"b50b1c67e2972d88bd8981e1a3db87ee14511c02cd94a92c47a372ec32761177"} Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.236017 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b0be43b-8956-45aa-ad50-de9183b3fea3","Type":"ContainerStarted","Data":"215a0a427916185913ef03f036755684e9f8fb11bc8d8ec6645e74d9b4d6fab0"} Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.236725 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sgspp" event={"ID":"b696cd6b-840b-4505-9010-114d223a90e9","Type":"ContainerStarted","Data":"1e72a81ebd6c0cbcca3631d9164e1b3194deb99d97abb1a18f67baa27d377916"} Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.283435 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-fw887" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.528436 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-5ab6-account-create-update-4xrnn"] Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.530076 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.533505 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.553770 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-5ab6-account-create-update-4xrnn"] Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.629184 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c996a30-f53d-49f1-a7d1-2ca23704b48e-operator-scripts\") pod \"mysqld-exporter-5ab6-account-create-update-4xrnn\" (UID: \"7c996a30-f53d-49f1-a7d1-2ca23704b48e\") " pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.629252 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n6pj\" (UniqueName: \"kubernetes.io/projected/7c996a30-f53d-49f1-a7d1-2ca23704b48e-kube-api-access-7n6pj\") pod \"mysqld-exporter-5ab6-account-create-update-4xrnn\" (UID: \"7c996a30-f53d-49f1-a7d1-2ca23704b48e\") " pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.731783 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n6pj\" (UniqueName: \"kubernetes.io/projected/7c996a30-f53d-49f1-a7d1-2ca23704b48e-kube-api-access-7n6pj\") pod \"mysqld-exporter-5ab6-account-create-update-4xrnn\" (UID: \"7c996a30-f53d-49f1-a7d1-2ca23704b48e\") " pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.732065 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c996a30-f53d-49f1-a7d1-2ca23704b48e-operator-scripts\") pod \"mysqld-exporter-5ab6-account-create-update-4xrnn\" (UID: \"7c996a30-f53d-49f1-a7d1-2ca23704b48e\") " pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.732909 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c996a30-f53d-49f1-a7d1-2ca23704b48e-operator-scripts\") pod \"mysqld-exporter-5ab6-account-create-update-4xrnn\" (UID: \"7c996a30-f53d-49f1-a7d1-2ca23704b48e\") " pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.751753 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n6pj\" (UniqueName: \"kubernetes.io/projected/7c996a30-f53d-49f1-a7d1-2ca23704b48e-kube-api-access-7n6pj\") pod \"mysqld-exporter-5ab6-account-create-update-4xrnn\" (UID: \"7c996a30-f53d-49f1-a7d1-2ca23704b48e\") " pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.853792 4886 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" Jan 29 17:04:31 crc kubenswrapper[4886]: I0129 17:04:31.861452 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-fw887"] Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.198175 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ff68z"] Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.229007 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ff68z" Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.240995 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.266817 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f0b5-account-create-update-8b8vz" event={"ID":"29921ec8-f68f-4547-a2c0-d4d3f5de6960","Type":"ContainerStarted","Data":"bb6b6c4443538f6a82366349284b39cf96fcba5ff7da991fc88f83ec4dbea3cd"} Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.269437 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9d0db9ae-746b-419a-bc61-bf85645d2bff","Type":"ContainerStarted","Data":"d1dc3fb46e158387bf0f32779951559ae37a47a019dba0a8cc0c029c48708606"} Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.270655 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ff68z"] Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.271309 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"842bfe4d-04ba-4143-9076-3033163c7b82","Type":"ContainerStarted","Data":"08d69a0d8dd87ebbab66b41851c9555c89b1c9518edbf660dc3fb4f99c870c1b"} Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.274702 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10","Type":"ContainerStarted","Data":"0f7d7bba0e7f3ae79ef50440ef9e40b86880917e00b15ccefc3f045f4186b63e"} Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.279787 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-fw887" event={"ID":"6479af73-81ef-4755-89b5-3a2dd44e99b3","Type":"ContainerStarted","Data":"467dace8916b0217ae148ecca1b8485085023c2a93c1b1258e47bf9de86c975f"} Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.281885 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d860-account-create-update-5kd66" event={"ID":"66c16915-30cc-4a4f-81ff-4b82cf152968","Type":"ContainerStarted","Data":"dae301d02f31a6be0962a543705953e6d92f427e7aa9bc8443d7688a4f7705a4"} Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.314464 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-5ab6-account-create-update-4xrnn"] Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.345088 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc8nt\" (UniqueName: \"kubernetes.io/projected/9b69834e-55cc-4ec2-b451-fafe1f417c53-kube-api-access-qc8nt\") pod \"root-account-create-update-ff68z\" (UID: \"9b69834e-55cc-4ec2-b451-fafe1f417c53\") " pod="openstack/root-account-create-update-ff68z" Jan 29 17:04:32 crc 
kubenswrapper[4886]: I0129 17:04:32.345198 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b69834e-55cc-4ec2-b451-fafe1f417c53-operator-scripts\") pod \"root-account-create-update-ff68z\" (UID: \"9b69834e-55cc-4ec2-b451-fafe1f417c53\") " pod="openstack/root-account-create-update-ff68z" Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.447477 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc8nt\" (UniqueName: \"kubernetes.io/projected/9b69834e-55cc-4ec2-b451-fafe1f417c53-kube-api-access-qc8nt\") pod \"root-account-create-update-ff68z\" (UID: \"9b69834e-55cc-4ec2-b451-fafe1f417c53\") " pod="openstack/root-account-create-update-ff68z" Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.447595 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b69834e-55cc-4ec2-b451-fafe1f417c53-operator-scripts\") pod \"root-account-create-update-ff68z\" (UID: \"9b69834e-55cc-4ec2-b451-fafe1f417c53\") " pod="openstack/root-account-create-update-ff68z" Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.448531 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b69834e-55cc-4ec2-b451-fafe1f417c53-operator-scripts\") pod \"root-account-create-update-ff68z\" (UID: \"9b69834e-55cc-4ec2-b451-fafe1f417c53\") " pod="openstack/root-account-create-update-ff68z" Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.468265 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc8nt\" (UniqueName: \"kubernetes.io/projected/9b69834e-55cc-4ec2-b451-fafe1f417c53-kube-api-access-qc8nt\") pod \"root-account-create-update-ff68z\" (UID: \"9b69834e-55cc-4ec2-b451-fafe1f417c53\") " pod="openstack/root-account-create-update-ff68z" Jan 29 17:04:32 crc kubenswrapper[4886]: I0129 17:04:32.759309 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ff68z" Jan 29 17:04:33 crc kubenswrapper[4886]: I0129 17:04:33.296130 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ff68z"] Jan 29 17:04:33 crc kubenswrapper[4886]: W0129 17:04:33.297671 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b69834e_55cc_4ec2_b451_fafe1f417c53.slice/crio-0e84b35431f435c10da1a1d55797c5bcb58d9704217c007ee48b93dde2741c31 WatchSource:0}: Error finding container 0e84b35431f435c10da1a1d55797c5bcb58d9704217c007ee48b93dde2741c31: Status 404 returned error can't find the container with id 0e84b35431f435c10da1a1d55797c5bcb58d9704217c007ee48b93dde2741c31 Jan 29 17:04:33 crc kubenswrapper[4886]: I0129 17:04:33.298344 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b7d9p-config-fbd7w" event={"ID":"e489f203-c94a-4bbb-b22a-750bec963d77","Type":"ContainerStarted","Data":"6412eac490b1fbd3d0b00a59dd461a3eb98d94b486a8096aadd0a5be64624a01"} Jan 29 17:04:33 crc kubenswrapper[4886]: I0129 17:04:33.299719 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mdvpb" event={"ID":"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe","Type":"ContainerStarted","Data":"cbbd4f5360c0e0e269db9be0e3b0c9d872ff0fa28897b05c76dba7a51c4b1e4c"} Jan 29 17:04:33 crc kubenswrapper[4886]: I0129 17:04:33.300877 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" event={"ID":"7c996a30-f53d-49f1-a7d1-2ca23704b48e","Type":"ContainerStarted","Data":"02ae7964e4db04590375f8dc8b2d4e000ef65dea8116644a045a8c2fec3c1786"} Jan 29 17:04:33 crc kubenswrapper[4886]: I0129 17:04:33.302947 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sgspp" event={"ID":"b696cd6b-840b-4505-9010-114d223a90e9","Type":"ContainerStarted","Data":"11300dda6841f3bcadbf8fc0b293c71f220072872935dad2eeec46ba483d2773"} Jan 29 17:04:33 crc kubenswrapper[4886]: I0129 17:04:33.305855 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-00e3-account-create-update-5hhsj" event={"ID":"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a","Type":"ContainerStarted","Data":"20030a467bab27996b15106f17b7491349b629c6d6de493fc3b1efb1f226e72c"} Jan 29 17:04:33 crc kubenswrapper[4886]: I0129 17:04:33.305896 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 29 17:04:33 crc kubenswrapper[4886]: I0129 17:04:33.334837 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=47.28895009 podStartE2EDuration="2m10.3348213s" podCreationTimestamp="2026-01-29 17:02:23 +0000 UTC" firstStartedPulling="2026-01-29 17:02:26.222949197 +0000 UTC m=+2429.131668479" lastFinishedPulling="2026-01-29 17:03:49.268820417 +0000 UTC m=+2512.177539689" observedRunningTime="2026-01-29 17:04:33.331216501 +0000 UTC m=+2556.239935773" watchObservedRunningTime="2026-01-29 17:04:33.3348213 +0000 UTC m=+2556.243540572" Jan 29 17:04:34 crc kubenswrapper[4886]: I0129 17:04:34.043973 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-b7d9p" Jan 29 17:04:34 crc kubenswrapper[4886]: I0129 17:04:34.318309 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ff68z" 
event={"ID":"9b69834e-55cc-4ec2-b451-fafe1f417c53","Type":"ContainerStarted","Data":"0e84b35431f435c10da1a1d55797c5bcb58d9704217c007ee48b93dde2741c31"} Jan 29 17:04:34 crc kubenswrapper[4886]: I0129 17:04:34.320839 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4vq4n" event={"ID":"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d","Type":"ContainerStarted","Data":"fbecb6255a3f2d33607adb71963134e7eb4f057014a12ad026702a5429304db4"} Jan 29 17:04:34 crc kubenswrapper[4886]: I0129 17:04:34.321140 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:04:34 crc kubenswrapper[4886]: I0129 17:04:34.321381 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Jan 29 17:04:34 crc kubenswrapper[4886]: I0129 17:04:34.352345 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371906.502464 podStartE2EDuration="2m10.352311364s" podCreationTimestamp="2026-01-29 17:02:24 +0000 UTC" firstStartedPulling="2026-01-29 17:02:26.89804805 +0000 UTC m=+2429.806767322" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:34.340278513 +0000 UTC m=+2557.248997795" watchObservedRunningTime="2026-01-29 17:04:34.352311364 +0000 UTC m=+2557.261030636" Jan 29 17:04:34 crc kubenswrapper[4886]: I0129 17:04:34.365815 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=49.452227729 podStartE2EDuration="2m11.365793055s" podCreationTimestamp="2026-01-29 17:02:23 +0000 UTC" firstStartedPulling="2026-01-29 17:02:26.484539421 +0000 UTC m=+2429.393258693" lastFinishedPulling="2026-01-29 17:03:48.398104747 +0000 UTC m=+2511.306824019" observedRunningTime="2026-01-29 17:04:34.360710876 +0000 UTC m=+2557.269430168" watchObservedRunningTime="2026-01-29 17:04:34.365793055 +0000 UTC m=+2557.274512327" Jan 29 17:04:34 crc kubenswrapper[4886]: I0129 17:04:34.408752 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-f0b5-account-create-update-8b8vz" podStartSLOduration=10.408732388 podStartE2EDuration="10.408732388s" podCreationTimestamp="2026-01-29 17:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:34.385251981 +0000 UTC m=+2557.293971263" watchObservedRunningTime="2026-01-29 17:04:34.408732388 +0000 UTC m=+2557.317451650" Jan 29 17:04:34 crc kubenswrapper[4886]: I0129 17:04:34.410690 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=49.181223366 podStartE2EDuration="2m11.410682052s" podCreationTimestamp="2026-01-29 17:02:23 +0000 UTC" firstStartedPulling="2026-01-29 17:02:26.168206379 +0000 UTC m=+2429.076925651" lastFinishedPulling="2026-01-29 17:03:48.397665065 +0000 UTC m=+2511.306384337" observedRunningTime="2026-01-29 17:04:34.401849369 +0000 UTC m=+2557.310568661" watchObservedRunningTime="2026-01-29 17:04:34.410682052 +0000 UTC m=+2557.319401334" Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.331355 4886 generic.go:334] "Generic (PLEG): container finished" podID="e489f203-c94a-4bbb-b22a-750bec963d77" containerID="6412eac490b1fbd3d0b00a59dd461a3eb98d94b486a8096aadd0a5be64624a01" exitCode=0 Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.331421 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b7d9p-config-fbd7w" event={"ID":"e489f203-c94a-4bbb-b22a-750bec963d77","Type":"ContainerDied","Data":"6412eac490b1fbd3d0b00a59dd461a3eb98d94b486a8096aadd0a5be64624a01"} Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.333627 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" event={"ID":"7c996a30-f53d-49f1-a7d1-2ca23704b48e","Type":"ContainerStarted","Data":"5019558a9253bbef2f27d289d48dcc75d2b0f7a1469d88aa8fb186da0d61df99"} Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.335284 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-fw887" event={"ID":"6479af73-81ef-4755-89b5-3a2dd44e99b3","Type":"ContainerStarted","Data":"0341a2566f1bb6385e4ca19bd7599e154fd2818c69290a143a8dae194ef6f346"} Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.337102 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce7955a1-eb58-425a-872a-7ec102b8e090","Type":"ContainerStarted","Data":"36870feb46aff15218a1df0a6e9d4aa854998ebadaa74a5a50b3e39905ffbc8c"} Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.339213 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ff68z" event={"ID":"9b69834e-55cc-4ec2-b451-fafe1f417c53","Type":"ContainerStarted","Data":"6e26b828a472fc3b1df8fa1fda19373a058c84b6a577b9a6475d17f33176e5c8"} Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.389746 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d860-account-create-update-5kd66" podStartSLOduration=7.389723446 podStartE2EDuration="7.389723446s" podCreationTimestamp="2026-01-29 17:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:35.379786533 +0000 UTC m=+2558.288505805" watchObservedRunningTime="2026-01-29 17:04:35.389723446 +0000 UTC m=+2558.298442718" Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.411946 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" podStartSLOduration=4.411921427 podStartE2EDuration="4.411921427s" podCreationTimestamp="2026-01-29 17:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:35.399100754 +0000 UTC m=+2558.307820036" watchObservedRunningTime="2026-01-29 17:04:35.411921427 +0000 UTC m=+2558.320640699" Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.426128 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-00e3-account-create-update-5hhsj" podStartSLOduration=7.426110388 podStartE2EDuration="7.426110388s" podCreationTimestamp="2026-01-29 17:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:35.420671948 +0000 UTC m=+2558.329391230" watchObservedRunningTime="2026-01-29 17:04:35.426110388 +0000 UTC m=+2558.334829660" Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.444015 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.445745 4886 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-4vq4n" podStartSLOduration=7.445722118 podStartE2EDuration="7.445722118s" podCreationTimestamp="2026-01-29 17:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:35.435172038 +0000 UTC m=+2558.343891320" watchObservedRunningTime="2026-01-29 17:04:35.445722118 +0000 UTC m=+2558.354441390" Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.466639 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-ff68z" podStartSLOduration=3.466618794 podStartE2EDuration="3.466618794s" podCreationTimestamp="2026-01-29 17:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:35.449681987 +0000 UTC m=+2558.358401249" watchObservedRunningTime="2026-01-29 17:04:35.466618794 +0000 UTC m=+2558.375338066" Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.471367 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-mdvpb" podStartSLOduration=11.471354024 podStartE2EDuration="11.471354024s" podCreationTimestamp="2026-01-29 17:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:35.462555742 +0000 UTC m=+2558.371275004" watchObservedRunningTime="2026-01-29 17:04:35.471354024 +0000 UTC m=+2558.380073296" Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.486912 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-db-create-fw887" podStartSLOduration=5.486893702 podStartE2EDuration="5.486893702s" podCreationTimestamp="2026-01-29 17:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:35.473297578 +0000 UTC m=+2558.382016860" watchObservedRunningTime="2026-01-29 17:04:35.486893702 +0000 UTC m=+2558.395612974" Jan 29 17:04:35 crc kubenswrapper[4886]: I0129 17:04:35.494373 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-sgspp" podStartSLOduration=7.494358908 podStartE2EDuration="7.494358908s" podCreationTimestamp="2026-01-29 17:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:04:35.485908845 +0000 UTC m=+2558.394628137" watchObservedRunningTime="2026-01-29 17:04:35.494358908 +0000 UTC m=+2558.403078180" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.012173 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.125902 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run\") pod \"e489f203-c94a-4bbb-b22a-750bec963d77\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.126012 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-additional-scripts\") pod \"e489f203-c94a-4bbb-b22a-750bec963d77\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.126039 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-log-ovn\") pod \"e489f203-c94a-4bbb-b22a-750bec963d77\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.126111 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-scripts\") pod \"e489f203-c94a-4bbb-b22a-750bec963d77\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.126103 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run" (OuterVolumeSpecName: "var-run") pod "e489f203-c94a-4bbb-b22a-750bec963d77" (UID: "e489f203-c94a-4bbb-b22a-750bec963d77"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.126149 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run-ovn\") pod \"e489f203-c94a-4bbb-b22a-750bec963d77\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.126175 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e489f203-c94a-4bbb-b22a-750bec963d77" (UID: "e489f203-c94a-4bbb-b22a-750bec963d77"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.126209 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e489f203-c94a-4bbb-b22a-750bec963d77" (UID: "e489f203-c94a-4bbb-b22a-750bec963d77"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.126381 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7nwl\" (UniqueName: \"kubernetes.io/projected/e489f203-c94a-4bbb-b22a-750bec963d77-kube-api-access-j7nwl\") pod \"e489f203-c94a-4bbb-b22a-750bec963d77\" (UID: \"e489f203-c94a-4bbb-b22a-750bec963d77\") " Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.126740 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e489f203-c94a-4bbb-b22a-750bec963d77" (UID: "e489f203-c94a-4bbb-b22a-750bec963d77"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.127075 4886 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.127096 4886 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.127108 4886 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.127117 4886 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e489f203-c94a-4bbb-b22a-750bec963d77-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.127073 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-scripts" (OuterVolumeSpecName: "scripts") pod "e489f203-c94a-4bbb-b22a-750bec963d77" (UID: "e489f203-c94a-4bbb-b22a-750bec963d77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.139699 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e489f203-c94a-4bbb-b22a-750bec963d77-kube-api-access-j7nwl" (OuterVolumeSpecName: "kube-api-access-j7nwl") pod "e489f203-c94a-4bbb-b22a-750bec963d77" (UID: "e489f203-c94a-4bbb-b22a-750bec963d77"). InnerVolumeSpecName "kube-api-access-j7nwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.229344 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7nwl\" (UniqueName: \"kubernetes.io/projected/e489f203-c94a-4bbb-b22a-750bec963d77-kube-api-access-j7nwl\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.229375 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e489f203-c94a-4bbb-b22a-750bec963d77-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.440378 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b7d9p-config-fbd7w" event={"ID":"e489f203-c94a-4bbb-b22a-750bec963d77","Type":"ContainerDied","Data":"3494f9c79f1c1ef413b78a2d49593156e0435e82f4c6ab83f28f950673f2985c"} Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.440422 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3494f9c79f1c1ef413b78a2d49593156e0435e82f4c6ab83f28f950673f2985c" Jan 29 17:04:40 crc kubenswrapper[4886]: I0129 17:04:40.440431 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b7d9p-config-fbd7w" Jan 29 17:04:41 crc kubenswrapper[4886]: I0129 17:04:41.111031 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-b7d9p-config-fbd7w"] Jan 29 17:04:41 crc kubenswrapper[4886]: I0129 17:04:41.121303 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-b7d9p-config-fbd7w"] Jan 29 17:04:42 crc kubenswrapper[4886]: I0129 17:04:42.615033 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:04:42 crc kubenswrapper[4886]: E0129 17:04:42.615585 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:04:42 crc kubenswrapper[4886]: I0129 17:04:42.658181 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e489f203-c94a-4bbb-b22a-750bec963d77" path="/var/lib/kubelet/pods/e489f203-c94a-4bbb-b22a-750bec963d77/volumes" Jan 29 17:04:44 crc kubenswrapper[4886]: I0129 17:04:44.310337 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:04:44 crc kubenswrapper[4886]: E0129 17:04:44.310582 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 17:04:44 crc kubenswrapper[4886]: E0129 17:04:44.310707 4886 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 17:04:44 crc kubenswrapper[4886]: E0129 17:04:44.310759 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift 
podName:6e2f2c6c-bc32-4a32-ba2c-8954d277ce47 nodeName:}" failed. No retries permitted until 2026-01-29 17:05:16.310743841 +0000 UTC m=+2599.219463113 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift") pod "swift-storage-0" (UID: "6e2f2c6c-bc32-4a32-ba2c-8954d277ce47") : configmap "swift-ring-files" not found Jan 29 17:04:45 crc kubenswrapper[4886]: I0129 17:04:45.299984 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="2b0be43b-8956-45aa-ad50-de9183b3fea3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.144:5671: connect: connection refused" Jan 29 17:04:45 crc kubenswrapper[4886]: I0129 17:04:45.452986 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.146:5671: connect: connection refused" Jan 29 17:04:45 crc kubenswrapper[4886]: I0129 17:04:45.639138 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="842bfe4d-04ba-4143-9076-3033163c7b82" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.145:5671: connect: connection refused" Jan 29 17:04:45 crc kubenswrapper[4886]: I0129 17:04:45.968898 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="9d0db9ae-746b-419a-bc61-bf85645d2bff" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.147:5671: connect: connection refused" Jan 29 17:04:48 crc kubenswrapper[4886]: I0129 17:04:48.512123 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s7294" event={"ID":"ebccb3a0-d421-4c30-9201-43e9106e4006","Type":"ContainerStarted","Data":"b9499d28202d4957e50821e930ae2c95870e6ae3730a64237a2f9f54f953765c"} Jan 29 17:04:49 crc kubenswrapper[4886]: E0129 17:04:49.072386 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:a223bab813b82d698992490bbb60927f6288a83ba52d539836c250e1471f6d34" Jan 29 17:04:49 crc kubenswrapper[4886]: E0129 17:04:49.072730 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:thanos-sidecar,Image:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:a223bab813b82d698992490bbb60927f6288a83ba52d539836c250e1471f6d34,Command:[],Args:[sidecar --prometheus.url=http://localhost:9090/ --grpc-address=:10901 --http-address=:10902 --log.level=info 
--prometheus.http-client-file=/etc/thanos/config/prometheus.http-client-file.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:10902,Protocol:TCP,HostIP:,},ContainerPort{Name:grpc,HostPort:0,ContainerPort:10901,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:thanos-prometheus-http-client-file,ReadOnly:false,MountPath:/etc/thanos/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w2cnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(ce7955a1-eb58-425a-872a-7ec102b8e090): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 17:04:49 crc kubenswrapper[4886]: E0129 17:04:49.073921 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\", failed to \"StartContainer\" for \"thanos-sidecar\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/prometheus-metric-storage-0" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" Jan 29 17:04:49 crc kubenswrapper[4886]: I0129 17:04:49.521889 4886 generic.go:334] "Generic (PLEG): container finished" podID="9b69834e-55cc-4ec2-b451-fafe1f417c53" containerID="6e26b828a472fc3b1df8fa1fda19373a058c84b6a577b9a6475d17f33176e5c8" exitCode=0 Jan 29 17:04:49 crc kubenswrapper[4886]: I0129 17:04:49.521992 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ff68z" event={"ID":"9b69834e-55cc-4ec2-b451-fafe1f417c53","Type":"ContainerDied","Data":"6e26b828a472fc3b1df8fa1fda19373a058c84b6a577b9a6475d17f33176e5c8"} Jan 29 17:04:49 crc kubenswrapper[4886]: I0129 17:04:49.525072 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 17:04:49 crc kubenswrapper[4886]: I0129 17:04:49.576807 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-s7294" podStartSLOduration=9.865146722 podStartE2EDuration="37.576782039s" podCreationTimestamp="2026-01-29 17:04:12 +0000 UTC" firstStartedPulling="2026-01-29 17:04:13.842912832 +0000 UTC m=+2536.751632104" lastFinishedPulling="2026-01-29 17:04:41.554548149 +0000 UTC m=+2564.463267421" observedRunningTime="2026-01-29 17:04:49.568216793 +0000 UTC m=+2572.476936075" watchObservedRunningTime="2026-01-29 17:04:49.576782039 
+0000 UTC m=+2572.485501311" Jan 29 17:04:50 crc kubenswrapper[4886]: I0129 17:04:50.963681 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ff68z" Jan 29 17:04:51 crc kubenswrapper[4886]: I0129 17:04:51.070963 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc8nt\" (UniqueName: \"kubernetes.io/projected/9b69834e-55cc-4ec2-b451-fafe1f417c53-kube-api-access-qc8nt\") pod \"9b69834e-55cc-4ec2-b451-fafe1f417c53\" (UID: \"9b69834e-55cc-4ec2-b451-fafe1f417c53\") " Jan 29 17:04:51 crc kubenswrapper[4886]: I0129 17:04:51.071121 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b69834e-55cc-4ec2-b451-fafe1f417c53-operator-scripts\") pod \"9b69834e-55cc-4ec2-b451-fafe1f417c53\" (UID: \"9b69834e-55cc-4ec2-b451-fafe1f417c53\") " Jan 29 17:04:51 crc kubenswrapper[4886]: I0129 17:04:51.071971 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b69834e-55cc-4ec2-b451-fafe1f417c53-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b69834e-55cc-4ec2-b451-fafe1f417c53" (UID: "9b69834e-55cc-4ec2-b451-fafe1f417c53"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:51 crc kubenswrapper[4886]: I0129 17:04:51.077608 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b69834e-55cc-4ec2-b451-fafe1f417c53-kube-api-access-qc8nt" (OuterVolumeSpecName: "kube-api-access-qc8nt") pod "9b69834e-55cc-4ec2-b451-fafe1f417c53" (UID: "9b69834e-55cc-4ec2-b451-fafe1f417c53"). InnerVolumeSpecName "kube-api-access-qc8nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:51 crc kubenswrapper[4886]: I0129 17:04:51.173893 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b69834e-55cc-4ec2-b451-fafe1f417c53-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:51 crc kubenswrapper[4886]: I0129 17:04:51.173930 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc8nt\" (UniqueName: \"kubernetes.io/projected/9b69834e-55cc-4ec2-b451-fafe1f417c53-kube-api-access-qc8nt\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:51 crc kubenswrapper[4886]: I0129 17:04:51.542582 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ff68z" event={"ID":"9b69834e-55cc-4ec2-b451-fafe1f417c53","Type":"ContainerDied","Data":"0e84b35431f435c10da1a1d55797c5bcb58d9704217c007ee48b93dde2741c31"} Jan 29 17:04:51 crc kubenswrapper[4886]: I0129 17:04:51.542639 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e84b35431f435c10da1a1d55797c5bcb58d9704217c007ee48b93dde2741c31" Jan 29 17:04:51 crc kubenswrapper[4886]: I0129 17:04:51.542708 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ff68z" Jan 29 17:04:53 crc kubenswrapper[4886]: I0129 17:04:53.615106 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:04:53 crc kubenswrapper[4886]: E0129 17:04:53.615948 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:04:54 crc kubenswrapper[4886]: E0129 17:04:54.481176 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"thanos-sidecar\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:a223bab813b82d698992490bbb60927f6288a83ba52d539836c250e1471f6d34\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.593869 4886 generic.go:334] "Generic (PLEG): container finished" podID="6479af73-81ef-4755-89b5-3a2dd44e99b3" containerID="0341a2566f1bb6385e4ca19bd7599e154fd2818c69290a143a8dae194ef6f346" exitCode=0 Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.593942 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-fw887" event={"ID":"6479af73-81ef-4755-89b5-3a2dd44e99b3","Type":"ContainerDied","Data":"0341a2566f1bb6385e4ca19bd7599e154fd2818c69290a143a8dae194ef6f346"} Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.595670 4886 generic.go:334] "Generic (PLEG): container finished" podID="66c16915-30cc-4a4f-81ff-4b82cf152968" containerID="dae301d02f31a6be0962a543705953e6d92f427e7aa9bc8443d7688a4f7705a4" exitCode=0 Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.595722 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d860-account-create-update-5kd66" event={"ID":"66c16915-30cc-4a4f-81ff-4b82cf152968","Type":"ContainerDied","Data":"dae301d02f31a6be0962a543705953e6d92f427e7aa9bc8443d7688a4f7705a4"} Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.600989 4886 generic.go:334] "Generic (PLEG): container finished" podID="aa302a57-5c6b-41b1-ac4b-7d9095b7b65a" containerID="20030a467bab27996b15106f17b7491349b629c6d6de493fc3b1efb1f226e72c" exitCode=0 Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.601052 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-00e3-account-create-update-5hhsj" event={"ID":"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a","Type":"ContainerDied","Data":"20030a467bab27996b15106f17b7491349b629c6d6de493fc3b1efb1f226e72c"} Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.604030 4886 generic.go:334] "Generic (PLEG): container finished" podID="29921ec8-f68f-4547-a2c0-d4d3f5de6960" containerID="bb6b6c4443538f6a82366349284b39cf96fcba5ff7da991fc88f83ec4dbea3cd" exitCode=0 Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.604087 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f0b5-account-create-update-8b8vz" event={"ID":"29921ec8-f68f-4547-a2c0-d4d3f5de6960","Type":"ContainerDied","Data":"bb6b6c4443538f6a82366349284b39cf96fcba5ff7da991fc88f83ec4dbea3cd"} Jan 29 17:04:54 crc 
kubenswrapper[4886]: I0129 17:04:54.607905 4886 generic.go:334] "Generic (PLEG): container finished" podID="6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d" containerID="fbecb6255a3f2d33607adb71963134e7eb4f057014a12ad026702a5429304db4" exitCode=0 Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.607992 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4vq4n" event={"ID":"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d","Type":"ContainerDied","Data":"fbecb6255a3f2d33607adb71963134e7eb4f057014a12ad026702a5429304db4"} Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.609778 4886 generic.go:334] "Generic (PLEG): container finished" podID="9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe" containerID="cbbd4f5360c0e0e269db9be0e3b0c9d872ff0fa28897b05c76dba7a51c4b1e4c" exitCode=0 Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.609832 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mdvpb" event={"ID":"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe","Type":"ContainerDied","Data":"cbbd4f5360c0e0e269db9be0e3b0c9d872ff0fa28897b05c76dba7a51c4b1e4c"} Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.612288 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce7955a1-eb58-425a-872a-7ec102b8e090","Type":"ContainerStarted","Data":"3a9c53d5227fb7b0c6bf2e7197762b1a4d147cab6dde0f951e7924a558b5e58d"} Jan 29 17:04:54 crc kubenswrapper[4886]: E0129 17:04:54.614665 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"thanos-sidecar\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:a223bab813b82d698992490bbb60927f6288a83ba52d539836c250e1471f6d34\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.621671 4886 generic.go:334] "Generic (PLEG): container finished" podID="7c996a30-f53d-49f1-a7d1-2ca23704b48e" containerID="5019558a9253bbef2f27d289d48dcc75d2b0f7a1469d88aa8fb186da0d61df99" exitCode=0 Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.623179 4886 generic.go:334] "Generic (PLEG): container finished" podID="b696cd6b-840b-4505-9010-114d223a90e9" containerID="11300dda6841f3bcadbf8fc0b293c71f220072872935dad2eeec46ba483d2773" exitCode=0 Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.627303 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" event={"ID":"7c996a30-f53d-49f1-a7d1-2ca23704b48e","Type":"ContainerDied","Data":"5019558a9253bbef2f27d289d48dcc75d2b0f7a1469d88aa8fb186da0d61df99"} Jan 29 17:04:54 crc kubenswrapper[4886]: I0129 17:04:54.630835 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sgspp" event={"ID":"b696cd6b-840b-4505-9010-114d223a90e9","Type":"ContainerDied","Data":"11300dda6841f3bcadbf8fc0b293c71f220072872935dad2eeec46ba483d2773"} Jan 29 17:04:55 crc kubenswrapper[4886]: I0129 17:04:55.297494 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="2b0be43b-8956-45aa-ad50-de9183b3fea3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.144:5671: connect: connection refused" Jan 29 17:04:55 crc kubenswrapper[4886]: I0129 17:04:55.429690 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ff68z"] Jan 29 17:04:55 crc kubenswrapper[4886]: 
I0129 17:04:55.436975 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ff68z"] Jan 29 17:04:55 crc kubenswrapper[4886]: I0129 17:04:55.445049 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.146:5671: connect: connection refused" Jan 29 17:04:55 crc kubenswrapper[4886]: I0129 17:04:55.640352 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="842bfe4d-04ba-4143-9076-3033163c7b82" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.145:5671: connect: connection refused" Jan 29 17:04:55 crc kubenswrapper[4886]: I0129 17:04:55.966201 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="9d0db9ae-746b-419a-bc61-bf85645d2bff" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.147:5671: connect: connection refused" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.523999 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f0b5-account-create-update-8b8vz" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.529132 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.541679 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sgspp" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.550113 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-00e3-account-create-update-5hhsj" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.556663 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4vq4n" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.568221 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d860-account-create-update-5kd66" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.585039 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mdvpb" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.591730 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-fw887" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.593686 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n6pj\" (UniqueName: \"kubernetes.io/projected/7c996a30-f53d-49f1-a7d1-2ca23704b48e-kube-api-access-7n6pj\") pod \"7c996a30-f53d-49f1-a7d1-2ca23704b48e\" (UID: \"7c996a30-f53d-49f1-a7d1-2ca23704b48e\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.593740 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29921ec8-f68f-4547-a2c0-d4d3f5de6960-operator-scripts\") pod \"29921ec8-f68f-4547-a2c0-d4d3f5de6960\" (UID: \"29921ec8-f68f-4547-a2c0-d4d3f5de6960\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.593790 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hc79\" (UniqueName: \"kubernetes.io/projected/b696cd6b-840b-4505-9010-114d223a90e9-kube-api-access-8hc79\") pod \"b696cd6b-840b-4505-9010-114d223a90e9\" (UID: \"b696cd6b-840b-4505-9010-114d223a90e9\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.593828 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-operator-scripts\") pod \"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a\" (UID: \"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.593904 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m47b2\" (UniqueName: \"kubernetes.io/projected/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-kube-api-access-m47b2\") pod \"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a\" (UID: \"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.593940 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxbwc\" (UniqueName: \"kubernetes.io/projected/29921ec8-f68f-4547-a2c0-d4d3f5de6960-kube-api-access-pxbwc\") pod \"29921ec8-f68f-4547-a2c0-d4d3f5de6960\" (UID: \"29921ec8-f68f-4547-a2c0-d4d3f5de6960\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.594037 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b696cd6b-840b-4505-9010-114d223a90e9-operator-scripts\") pod \"b696cd6b-840b-4505-9010-114d223a90e9\" (UID: \"b696cd6b-840b-4505-9010-114d223a90e9\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.594071 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c996a30-f53d-49f1-a7d1-2ca23704b48e-operator-scripts\") pod \"7c996a30-f53d-49f1-a7d1-2ca23704b48e\" (UID: \"7c996a30-f53d-49f1-a7d1-2ca23704b48e\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.594313 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa302a57-5c6b-41b1-ac4b-7d9095b7b65a" (UID: "aa302a57-5c6b-41b1-ac4b-7d9095b7b65a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.594686 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.594901 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b696cd6b-840b-4505-9010-114d223a90e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b696cd6b-840b-4505-9010-114d223a90e9" (UID: "b696cd6b-840b-4505-9010-114d223a90e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.594926 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c996a30-f53d-49f1-a7d1-2ca23704b48e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c996a30-f53d-49f1-a7d1-2ca23704b48e" (UID: "7c996a30-f53d-49f1-a7d1-2ca23704b48e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.595128 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29921ec8-f68f-4547-a2c0-d4d3f5de6960-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29921ec8-f68f-4547-a2c0-d4d3f5de6960" (UID: "29921ec8-f68f-4547-a2c0-d4d3f5de6960"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.608101 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29921ec8-f68f-4547-a2c0-d4d3f5de6960-kube-api-access-pxbwc" (OuterVolumeSpecName: "kube-api-access-pxbwc") pod "29921ec8-f68f-4547-a2c0-d4d3f5de6960" (UID: "29921ec8-f68f-4547-a2c0-d4d3f5de6960"). InnerVolumeSpecName "kube-api-access-pxbwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.608188 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-kube-api-access-m47b2" (OuterVolumeSpecName: "kube-api-access-m47b2") pod "aa302a57-5c6b-41b1-ac4b-7d9095b7b65a" (UID: "aa302a57-5c6b-41b1-ac4b-7d9095b7b65a"). InnerVolumeSpecName "kube-api-access-m47b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.626828 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b696cd6b-840b-4505-9010-114d223a90e9-kube-api-access-8hc79" (OuterVolumeSpecName: "kube-api-access-8hc79") pod "b696cd6b-840b-4505-9010-114d223a90e9" (UID: "b696cd6b-840b-4505-9010-114d223a90e9"). InnerVolumeSpecName "kube-api-access-8hc79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.627006 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c996a30-f53d-49f1-a7d1-2ca23704b48e-kube-api-access-7n6pj" (OuterVolumeSpecName: "kube-api-access-7n6pj") pod "7c996a30-f53d-49f1-a7d1-2ca23704b48e" (UID: "7c996a30-f53d-49f1-a7d1-2ca23704b48e"). InnerVolumeSpecName "kube-api-access-7n6pj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.664476 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d860-account-create-update-5kd66" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.681164 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4vq4n" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.692181 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b69834e-55cc-4ec2-b451-fafe1f417c53" path="/var/lib/kubelet/pods/9b69834e-55cc-4ec2-b451-fafe1f417c53/volumes" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.696346 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4xhg\" (UniqueName: \"kubernetes.io/projected/6479af73-81ef-4755-89b5-3a2dd44e99b3-kube-api-access-m4xhg\") pod \"6479af73-81ef-4755-89b5-3a2dd44e99b3\" (UID: \"6479af73-81ef-4755-89b5-3a2dd44e99b3\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.696451 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzhjq\" (UniqueName: \"kubernetes.io/projected/66c16915-30cc-4a4f-81ff-4b82cf152968-kube-api-access-lzhjq\") pod \"66c16915-30cc-4a4f-81ff-4b82cf152968\" (UID: \"66c16915-30cc-4a4f-81ff-4b82cf152968\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.696488 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8547\" (UniqueName: \"kubernetes.io/projected/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-kube-api-access-n8547\") pod \"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d\" (UID: \"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.696571 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-operator-scripts\") pod \"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d\" (UID: \"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.696693 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-operator-scripts\") pod \"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe\" (UID: \"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.696782 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66c16915-30cc-4a4f-81ff-4b82cf152968-operator-scripts\") pod \"66c16915-30cc-4a4f-81ff-4b82cf152968\" (UID: \"66c16915-30cc-4a4f-81ff-4b82cf152968\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.698499 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2mjv\" (UniqueName: \"kubernetes.io/projected/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-kube-api-access-s2mjv\") pod \"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe\" (UID: \"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.698591 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6479af73-81ef-4755-89b5-3a2dd44e99b3-operator-scripts\") pod \"6479af73-81ef-4755-89b5-3a2dd44e99b3\" 
(UID: \"6479af73-81ef-4755-89b5-3a2dd44e99b3\") " Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.699374 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d" (UID: "6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.699589 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6479af73-81ef-4755-89b5-3a2dd44e99b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6479af73-81ef-4755-89b5-3a2dd44e99b3" (UID: "6479af73-81ef-4755-89b5-3a2dd44e99b3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.699963 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe" (UID: "9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.704814 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c16915-30cc-4a4f-81ff-4b82cf152968-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66c16915-30cc-4a4f-81ff-4b82cf152968" (UID: "66c16915-30cc-4a4f-81ff-4b82cf152968"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.708773 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mdvpb" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.710234 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6479af73-81ef-4755-89b5-3a2dd44e99b3-kube-api-access-m4xhg" (OuterVolumeSpecName: "kube-api-access-m4xhg") pod "6479af73-81ef-4755-89b5-3a2dd44e99b3" (UID: "6479af73-81ef-4755-89b5-3a2dd44e99b3"). InnerVolumeSpecName "kube-api-access-m4xhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.710647 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-kube-api-access-s2mjv" (OuterVolumeSpecName: "kube-api-access-s2mjv") pod "9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe" (UID: "9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe"). InnerVolumeSpecName "kube-api-access-s2mjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.711081 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-kube-api-access-n8547" (OuterVolumeSpecName: "kube-api-access-n8547") pod "6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d" (UID: "6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d"). InnerVolumeSpecName "kube-api-access-n8547". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712022 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712344 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hc79\" (UniqueName: \"kubernetes.io/projected/b696cd6b-840b-4505-9010-114d223a90e9-kube-api-access-8hc79\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712445 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4xhg\" (UniqueName: \"kubernetes.io/projected/6479af73-81ef-4755-89b5-3a2dd44e99b3-kube-api-access-m4xhg\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712522 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m47b2\" (UniqueName: \"kubernetes.io/projected/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a-kube-api-access-m47b2\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712594 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8547\" (UniqueName: \"kubernetes.io/projected/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-kube-api-access-n8547\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712699 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxbwc\" (UniqueName: \"kubernetes.io/projected/29921ec8-f68f-4547-a2c0-d4d3f5de6960-kube-api-access-pxbwc\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712731 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712748 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712761 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b696cd6b-840b-4505-9010-114d223a90e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712790 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c996a30-f53d-49f1-a7d1-2ca23704b48e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712803 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66c16915-30cc-4a4f-81ff-4b82cf152968-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712820 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2mjv\" (UniqueName: \"kubernetes.io/projected/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe-kube-api-access-s2mjv\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712833 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n6pj\" (UniqueName: \"kubernetes.io/projected/7c996a30-f53d-49f1-a7d1-2ca23704b48e-kube-api-access-7n6pj\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712846 4886 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29921ec8-f68f-4547-a2c0-d4d3f5de6960-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.712862 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6479af73-81ef-4755-89b5-3a2dd44e99b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.718615 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c16915-30cc-4a4f-81ff-4b82cf152968-kube-api-access-lzhjq" (OuterVolumeSpecName: "kube-api-access-lzhjq") pod "66c16915-30cc-4a4f-81ff-4b82cf152968" (UID: "66c16915-30cc-4a4f-81ff-4b82cf152968"). InnerVolumeSpecName "kube-api-access-lzhjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.719208 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sgspp" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.720794 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-fw887" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727097 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-00e3-account-create-update-5hhsj" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727840 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d860-account-create-update-5kd66" event={"ID":"66c16915-30cc-4a4f-81ff-4b82cf152968","Type":"ContainerDied","Data":"4b1a89009d472fe5b2dceb7b8a0b8294983468e34c2707bffbc7bce6c3368172"} Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727895 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b1a89009d472fe5b2dceb7b8a0b8294983468e34c2707bffbc7bce6c3368172" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727918 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4vq4n" event={"ID":"6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d","Type":"ContainerDied","Data":"01b4206a66380781bc1d5bf890de4dd2a4c91be01985eaaaf4ae95a14ceba772"} Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727930 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01b4206a66380781bc1d5bf890de4dd2a4c91be01985eaaaf4ae95a14ceba772" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727940 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mdvpb" event={"ID":"9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe","Type":"ContainerDied","Data":"b50b1c67e2972d88bd8981e1a3db87ee14511c02cd94a92c47a372ec32761177"} Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727951 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b50b1c67e2972d88bd8981e1a3db87ee14511c02cd94a92c47a372ec32761177" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727960 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-5ab6-account-create-update-4xrnn" event={"ID":"7c996a30-f53d-49f1-a7d1-2ca23704b48e","Type":"ContainerDied","Data":"02ae7964e4db04590375f8dc8b2d4e000ef65dea8116644a045a8c2fec3c1786"} Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727972 4886 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="02ae7964e4db04590375f8dc8b2d4e000ef65dea8116644a045a8c2fec3c1786" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727980 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sgspp" event={"ID":"b696cd6b-840b-4505-9010-114d223a90e9","Type":"ContainerDied","Data":"1e72a81ebd6c0cbcca3631d9164e1b3194deb99d97abb1a18f67baa27d377916"} Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727990 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e72a81ebd6c0cbcca3631d9164e1b3194deb99d97abb1a18f67baa27d377916" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.727999 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-fw887" event={"ID":"6479af73-81ef-4755-89b5-3a2dd44e99b3","Type":"ContainerDied","Data":"467dace8916b0217ae148ecca1b8485085023c2a93c1b1258e47bf9de86c975f"} Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.728011 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="467dace8916b0217ae148ecca1b8485085023c2a93c1b1258e47bf9de86c975f" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.728021 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-00e3-account-create-update-5hhsj" event={"ID":"aa302a57-5c6b-41b1-ac4b-7d9095b7b65a","Type":"ContainerDied","Data":"75581e1d16d26560497cc9988813329216f56a92bcacbc7cddb3b31eef34be95"} Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.728031 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75581e1d16d26560497cc9988813329216f56a92bcacbc7cddb3b31eef34be95" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.731577 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f0b5-account-create-update-8b8vz" event={"ID":"29921ec8-f68f-4547-a2c0-d4d3f5de6960","Type":"ContainerDied","Data":"e3585e24c6e310ab66cc3acdb8b7196a729aef835b23a64db0aa1d39659b162c"} Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.731615 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3585e24c6e310ab66cc3acdb8b7196a729aef835b23a64db0aa1d39659b162c" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.731689 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f0b5-account-create-update-8b8vz" Jan 29 17:04:56 crc kubenswrapper[4886]: I0129 17:04:56.815370 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzhjq\" (UniqueName: \"kubernetes.io/projected/66c16915-30cc-4a4f-81ff-4b82cf152968-kube-api-access-lzhjq\") on node \"crc\" DevicePath \"\"" Jan 29 17:04:57 crc kubenswrapper[4886]: I0129 17:04:57.463571 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 29 17:04:57 crc kubenswrapper[4886]: E0129 17:04:57.466805 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"thanos-sidecar\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:a223bab813b82d698992490bbb60927f6288a83ba52d539836c250e1471f6d34\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.725262 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-thqn5"] Jan 29 17:04:59 crc kubenswrapper[4886]: E0129 17:04:59.727494 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.727602 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: E0129 17:04:59.727677 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b696cd6b-840b-4505-9010-114d223a90e9" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.727781 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b696cd6b-840b-4505-9010-114d223a90e9" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: E0129 17:04:59.727867 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b69834e-55cc-4ec2-b451-fafe1f417c53" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.727938 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b69834e-55cc-4ec2-b451-fafe1f417c53" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: E0129 17:04:59.728014 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6479af73-81ef-4755-89b5-3a2dd44e99b3" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.729479 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6479af73-81ef-4755-89b5-3a2dd44e99b3" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: E0129 17:04:59.729582 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29921ec8-f68f-4547-a2c0-d4d3f5de6960" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.729652 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="29921ec8-f68f-4547-a2c0-d4d3f5de6960" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: E0129 17:04:59.729715 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c16915-30cc-4a4f-81ff-4b82cf152968" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.729763 4886 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="66c16915-30cc-4a4f-81ff-4b82cf152968" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: E0129 17:04:59.729827 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c996a30-f53d-49f1-a7d1-2ca23704b48e" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.729882 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c996a30-f53d-49f1-a7d1-2ca23704b48e" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: E0129 17:04:59.729932 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa302a57-5c6b-41b1-ac4b-7d9095b7b65a" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.729979 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa302a57-5c6b-41b1-ac4b-7d9095b7b65a" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: E0129 17:04:59.730034 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.730086 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: E0129 17:04:59.730144 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e489f203-c94a-4bbb-b22a-750bec963d77" containerName="ovn-config" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.730192 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e489f203-c94a-4bbb-b22a-750bec963d77" containerName="ovn-config" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.730461 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c16915-30cc-4a4f-81ff-4b82cf152968" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.730528 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa302a57-5c6b-41b1-ac4b-7d9095b7b65a" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.730607 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e489f203-c94a-4bbb-b22a-750bec963d77" containerName="ovn-config" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.730677 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.730743 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b696cd6b-840b-4505-9010-114d223a90e9" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.730828 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.731090 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6479af73-81ef-4755-89b5-3a2dd44e99b3" containerName="mariadb-database-create" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.731164 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c996a30-f53d-49f1-a7d1-2ca23704b48e" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.731239 4886 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="29921ec8-f68f-4547-a2c0-d4d3f5de6960" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.731311 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b69834e-55cc-4ec2-b451-fafe1f417c53" containerName="mariadb-account-create-update" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.732194 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.745087 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-thqn5"] Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.745801 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.746121 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cpfdg" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.786666 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c7r8\" (UniqueName: \"kubernetes.io/projected/9f114908-5594-4378-939f-f54b2157d676-kube-api-access-6c7r8\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.786813 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-combined-ca-bundle\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.786913 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-config-data\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.787018 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-db-sync-config-data\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.889167 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c7r8\" (UniqueName: \"kubernetes.io/projected/9f114908-5594-4378-939f-f54b2157d676-kube-api-access-6c7r8\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.889258 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-combined-ca-bundle\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.889388 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-config-data\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.889459 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-db-sync-config-data\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.894859 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-db-sync-config-data\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.895383 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-config-data\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.903720 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-combined-ca-bundle\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:04:59 crc kubenswrapper[4886]: I0129 17:04:59.915951 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c7r8\" (UniqueName: \"kubernetes.io/projected/9f114908-5594-4378-939f-f54b2157d676-kube-api-access-6c7r8\") pod \"glance-db-sync-thqn5\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " pod="openstack/glance-db-sync-thqn5" Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.049567 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-thqn5" Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.471576 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xg8wq"] Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.474793 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xg8wq" Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.477870 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.489378 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xg8wq"] Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.608182 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40b94c98-0561-4135-a5af-023ef5f4ad67-operator-scripts\") pod \"root-account-create-update-xg8wq\" (UID: \"40b94c98-0561-4135-a5af-023ef5f4ad67\") " pod="openstack/root-account-create-update-xg8wq" Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.608435 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbvff\" (UniqueName: \"kubernetes.io/projected/40b94c98-0561-4135-a5af-023ef5f4ad67-kube-api-access-hbvff\") pod \"root-account-create-update-xg8wq\" (UID: \"40b94c98-0561-4135-a5af-023ef5f4ad67\") " pod="openstack/root-account-create-update-xg8wq" Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.710667 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbvff\" (UniqueName: \"kubernetes.io/projected/40b94c98-0561-4135-a5af-023ef5f4ad67-kube-api-access-hbvff\") pod \"root-account-create-update-xg8wq\" (UID: \"40b94c98-0561-4135-a5af-023ef5f4ad67\") " pod="openstack/root-account-create-update-xg8wq" Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.710914 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40b94c98-0561-4135-a5af-023ef5f4ad67-operator-scripts\") pod \"root-account-create-update-xg8wq\" (UID: \"40b94c98-0561-4135-a5af-023ef5f4ad67\") " pod="openstack/root-account-create-update-xg8wq" Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.711893 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40b94c98-0561-4135-a5af-023ef5f4ad67-operator-scripts\") pod \"root-account-create-update-xg8wq\" (UID: \"40b94c98-0561-4135-a5af-023ef5f4ad67\") " pod="openstack/root-account-create-update-xg8wq" Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.736801 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbvff\" (UniqueName: \"kubernetes.io/projected/40b94c98-0561-4135-a5af-023ef5f4ad67-kube-api-access-hbvff\") pod \"root-account-create-update-xg8wq\" (UID: \"40b94c98-0561-4135-a5af-023ef5f4ad67\") " pod="openstack/root-account-create-update-xg8wq" Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.739395 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-thqn5"] Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.769021 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-thqn5" event={"ID":"9f114908-5594-4378-939f-f54b2157d676","Type":"ContainerStarted","Data":"fcc8bbf40553cde9c2b386443b55115feca44b41f5cbd715334aa7b1506eef78"} Jan 29 17:05:00 crc kubenswrapper[4886]: I0129 17:05:00.793525 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xg8wq" Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.377401 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xg8wq"] Jan 29 17:05:01 crc kubenswrapper[4886]: W0129 17:05:01.378180 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40b94c98_0561_4135_a5af_023ef5f4ad67.slice/crio-7681989d41c3df63a9cfe16c457a7c04de933a5c485b9a6a131f7473a305fd74 WatchSource:0}: Error finding container 7681989d41c3df63a9cfe16c457a7c04de933a5c485b9a6a131f7473a305fd74: Status 404 returned error can't find the container with id 7681989d41c3df63a9cfe16c457a7c04de933a5c485b9a6a131f7473a305fd74 Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.784411 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xg8wq" event={"ID":"40b94c98-0561-4135-a5af-023ef5f4ad67","Type":"ContainerStarted","Data":"2e89a5a701ca89a4fedcbc0c8d956d6d340377591f80cf75f3cdedc6fb2cd6f3"} Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.784462 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xg8wq" event={"ID":"40b94c98-0561-4135-a5af-023ef5f4ad67","Type":"ContainerStarted","Data":"7681989d41c3df63a9cfe16c457a7c04de933a5c485b9a6a131f7473a305fd74"} Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.788270 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4"] Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.790028 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.811679 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4"] Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.824436 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-xg8wq" podStartSLOduration=1.824408045 podStartE2EDuration="1.824408045s" podCreationTimestamp="2026-01-29 17:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:05:01.806108441 +0000 UTC m=+2584.714827713" watchObservedRunningTime="2026-01-29 17:05:01.824408045 +0000 UTC m=+2584.733127317" Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.945837 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-sl5h4\" (UID: \"d8a69a79-4e4c-4815-8cf5-0864ff2b8026\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.945928 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gng45\" (UniqueName: \"kubernetes.io/projected/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-kube-api-access-gng45\") pod \"mysqld-exporter-openstack-cell1-db-create-sl5h4\" (UID: \"d8a69a79-4e4c-4815-8cf5-0864ff2b8026\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.995030 4886 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-23ad-account-create-update-2dsmj"] Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.996614 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" Jan 29 17:05:01 crc kubenswrapper[4886]: I0129 17:05:01.999132 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.005748 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-23ad-account-create-update-2dsmj"] Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.047905 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-sl5h4\" (UID: \"d8a69a79-4e4c-4815-8cf5-0864ff2b8026\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.047974 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gng45\" (UniqueName: \"kubernetes.io/projected/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-kube-api-access-gng45\") pod \"mysqld-exporter-openstack-cell1-db-create-sl5h4\" (UID: \"d8a69a79-4e4c-4815-8cf5-0864ff2b8026\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.051652 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-sl5h4\" (UID: \"d8a69a79-4e4c-4815-8cf5-0864ff2b8026\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.067019 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gng45\" (UniqueName: \"kubernetes.io/projected/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-kube-api-access-gng45\") pod \"mysqld-exporter-openstack-cell1-db-create-sl5h4\" (UID: \"d8a69a79-4e4c-4815-8cf5-0864ff2b8026\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.109233 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.151097 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2ed1f90-1318-483e-901c-bff80e1e94b6-operator-scripts\") pod \"mysqld-exporter-23ad-account-create-update-2dsmj\" (UID: \"d2ed1f90-1318-483e-901c-bff80e1e94b6\") " pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.151437 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b24b5\" (UniqueName: \"kubernetes.io/projected/d2ed1f90-1318-483e-901c-bff80e1e94b6-kube-api-access-b24b5\") pod \"mysqld-exporter-23ad-account-create-update-2dsmj\" (UID: \"d2ed1f90-1318-483e-901c-bff80e1e94b6\") " pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.253422 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b24b5\" (UniqueName: \"kubernetes.io/projected/d2ed1f90-1318-483e-901c-bff80e1e94b6-kube-api-access-b24b5\") pod \"mysqld-exporter-23ad-account-create-update-2dsmj\" (UID: \"d2ed1f90-1318-483e-901c-bff80e1e94b6\") " pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.253894 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2ed1f90-1318-483e-901c-bff80e1e94b6-operator-scripts\") pod \"mysqld-exporter-23ad-account-create-update-2dsmj\" (UID: \"d2ed1f90-1318-483e-901c-bff80e1e94b6\") " pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.254711 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2ed1f90-1318-483e-901c-bff80e1e94b6-operator-scripts\") pod \"mysqld-exporter-23ad-account-create-update-2dsmj\" (UID: \"d2ed1f90-1318-483e-901c-bff80e1e94b6\") " pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.277376 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b24b5\" (UniqueName: \"kubernetes.io/projected/d2ed1f90-1318-483e-901c-bff80e1e94b6-kube-api-access-b24b5\") pod \"mysqld-exporter-23ad-account-create-update-2dsmj\" (UID: \"d2ed1f90-1318-483e-901c-bff80e1e94b6\") " pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.316253 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.465318 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.473164 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.637747 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4"] Jan 29 17:05:02 crc kubenswrapper[4886]: W0129 17:05:02.654382 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8a69a79_4e4c_4815_8cf5_0864ff2b8026.slice/crio-27e604caf8b15942348375a9990e1bf7c1fa6aa35968cf73fc93abd1ac9c4cad WatchSource:0}: Error finding container 27e604caf8b15942348375a9990e1bf7c1fa6aa35968cf73fc93abd1ac9c4cad: Status 404 returned error can't find the container with id 27e604caf8b15942348375a9990e1bf7c1fa6aa35968cf73fc93abd1ac9c4cad Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.829206 4886 generic.go:334] "Generic (PLEG): container finished" podID="40b94c98-0561-4135-a5af-023ef5f4ad67" containerID="2e89a5a701ca89a4fedcbc0c8d956d6d340377591f80cf75f3cdedc6fb2cd6f3" exitCode=0 Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.829709 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xg8wq" event={"ID":"40b94c98-0561-4135-a5af-023ef5f4ad67","Type":"ContainerDied","Data":"2e89a5a701ca89a4fedcbc0c8d956d6d340377591f80cf75f3cdedc6fb2cd6f3"} Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.837641 4886 generic.go:334] "Generic (PLEG): container finished" podID="ebccb3a0-d421-4c30-9201-43e9106e4006" containerID="b9499d28202d4957e50821e930ae2c95870e6ae3730a64237a2f9f54f953765c" exitCode=0 Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.837755 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s7294" event={"ID":"ebccb3a0-d421-4c30-9201-43e9106e4006","Type":"ContainerDied","Data":"b9499d28202d4957e50821e930ae2c95870e6ae3730a64237a2f9f54f953765c"} Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.839833 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" event={"ID":"d8a69a79-4e4c-4815-8cf5-0864ff2b8026","Type":"ContainerStarted","Data":"27e604caf8b15942348375a9990e1bf7c1fa6aa35968cf73fc93abd1ac9c4cad"} Jan 29 17:05:02 crc kubenswrapper[4886]: I0129 17:05:02.932582 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-23ad-account-create-update-2dsmj"] Jan 29 17:05:03 crc kubenswrapper[4886]: I0129 17:05:03.851724 4886 generic.go:334] "Generic (PLEG): container finished" podID="d8a69a79-4e4c-4815-8cf5-0864ff2b8026" containerID="ef7ef7e1c633f815512fbc83adaa9bb46d23ddf73eb8c93c02d1c3c3b64a5fcf" exitCode=0 Jan 29 17:05:03 crc kubenswrapper[4886]: I0129 17:05:03.851772 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" event={"ID":"d8a69a79-4e4c-4815-8cf5-0864ff2b8026","Type":"ContainerDied","Data":"ef7ef7e1c633f815512fbc83adaa9bb46d23ddf73eb8c93c02d1c3c3b64a5fcf"} Jan 29 17:05:03 crc kubenswrapper[4886]: I0129 17:05:03.857149 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="d2ed1f90-1318-483e-901c-bff80e1e94b6" containerID="d34996a936f771ac75eec769fb4795e0b3637c5867ba052c3b34c2c7b2aee667" exitCode=0 Jan 29 17:05:03 crc kubenswrapper[4886]: I0129 17:05:03.857243 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" event={"ID":"d2ed1f90-1318-483e-901c-bff80e1e94b6","Type":"ContainerDied","Data":"d34996a936f771ac75eec769fb4795e0b3637c5867ba052c3b34c2c7b2aee667"} Jan 29 17:05:03 crc kubenswrapper[4886]: I0129 17:05:03.857297 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" event={"ID":"d2ed1f90-1318-483e-901c-bff80e1e94b6","Type":"ContainerStarted","Data":"563f68afde711b3cca93a3f5d5dbae0e6aee5931cf6f7c5cb99463997cce21b1"} Jan 29 17:05:03 crc kubenswrapper[4886]: I0129 17:05:03.861908 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce7955a1-eb58-425a-872a-7ec102b8e090","Type":"ContainerStarted","Data":"29b6600206cc1bb7f3f16719ec90e5544c72d2eaf5a596eaa0dcf19be615c898"} Jan 29 17:05:03 crc kubenswrapper[4886]: I0129 17:05:03.862882 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:03 crc kubenswrapper[4886]: I0129 17:05:03.921583 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=63.471724184 podStartE2EDuration="2m32.921560895s" podCreationTimestamp="2026-01-29 17:02:31 +0000 UTC" firstStartedPulling="2026-01-29 17:03:34.193704497 +0000 UTC m=+2497.102423769" lastFinishedPulling="2026-01-29 17:05:03.643541208 +0000 UTC m=+2586.552260480" observedRunningTime="2026-01-29 17:05:03.920964979 +0000 UTC m=+2586.829684251" watchObservedRunningTime="2026-01-29 17:05:03.921560895 +0000 UTC m=+2586.830280177" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.405236 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.524967 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-dispersionconf\") pod \"ebccb3a0-d421-4c30-9201-43e9106e4006\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.525008 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-swiftconf\") pod \"ebccb3a0-d421-4c30-9201-43e9106e4006\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.525050 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km9gr\" (UniqueName: \"kubernetes.io/projected/ebccb3a0-d421-4c30-9201-43e9106e4006-kube-api-access-km9gr\") pod \"ebccb3a0-d421-4c30-9201-43e9106e4006\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.525097 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-ring-data-devices\") pod \"ebccb3a0-d421-4c30-9201-43e9106e4006\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.525637 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ebccb3a0-d421-4c30-9201-43e9106e4006" (UID: "ebccb3a0-d421-4c30-9201-43e9106e4006"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.526309 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-scripts\") pod \"ebccb3a0-d421-4c30-9201-43e9106e4006\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.526358 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-combined-ca-bundle\") pod \"ebccb3a0-d421-4c30-9201-43e9106e4006\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.526431 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ebccb3a0-d421-4c30-9201-43e9106e4006-etc-swift\") pod \"ebccb3a0-d421-4c30-9201-43e9106e4006\" (UID: \"ebccb3a0-d421-4c30-9201-43e9106e4006\") " Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.527108 4886 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.528030 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebccb3a0-d421-4c30-9201-43e9106e4006-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ebccb3a0-d421-4c30-9201-43e9106e4006" (UID: "ebccb3a0-d421-4c30-9201-43e9106e4006"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.533746 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebccb3a0-d421-4c30-9201-43e9106e4006-kube-api-access-km9gr" (OuterVolumeSpecName: "kube-api-access-km9gr") pod "ebccb3a0-d421-4c30-9201-43e9106e4006" (UID: "ebccb3a0-d421-4c30-9201-43e9106e4006"). InnerVolumeSpecName "kube-api-access-km9gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.536847 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ebccb3a0-d421-4c30-9201-43e9106e4006" (UID: "ebccb3a0-d421-4c30-9201-43e9106e4006"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.552243 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-scripts" (OuterVolumeSpecName: "scripts") pod "ebccb3a0-d421-4c30-9201-43e9106e4006" (UID: "ebccb3a0-d421-4c30-9201-43e9106e4006"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.563502 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ebccb3a0-d421-4c30-9201-43e9106e4006" (UID: "ebccb3a0-d421-4c30-9201-43e9106e4006"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.575454 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebccb3a0-d421-4c30-9201-43e9106e4006" (UID: "ebccb3a0-d421-4c30-9201-43e9106e4006"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.593849 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xg8wq" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.628942 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebccb3a0-d421-4c30-9201-43e9106e4006-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.628981 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.629030 4886 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ebccb3a0-d421-4c30-9201-43e9106e4006-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.629046 4886 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.629058 4886 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ebccb3a0-d421-4c30-9201-43e9106e4006-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.629071 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km9gr\" (UniqueName: \"kubernetes.io/projected/ebccb3a0-d421-4c30-9201-43e9106e4006-kube-api-access-km9gr\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.730800 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbvff\" (UniqueName: \"kubernetes.io/projected/40b94c98-0561-4135-a5af-023ef5f4ad67-kube-api-access-hbvff\") pod \"40b94c98-0561-4135-a5af-023ef5f4ad67\" (UID: \"40b94c98-0561-4135-a5af-023ef5f4ad67\") " Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.730954 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40b94c98-0561-4135-a5af-023ef5f4ad67-operator-scripts\") pod \"40b94c98-0561-4135-a5af-023ef5f4ad67\" (UID: \"40b94c98-0561-4135-a5af-023ef5f4ad67\") " Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.733151 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40b94c98-0561-4135-a5af-023ef5f4ad67-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40b94c98-0561-4135-a5af-023ef5f4ad67" (UID: "40b94c98-0561-4135-a5af-023ef5f4ad67"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.735200 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b94c98-0561-4135-a5af-023ef5f4ad67-kube-api-access-hbvff" (OuterVolumeSpecName: "kube-api-access-hbvff") pod "40b94c98-0561-4135-a5af-023ef5f4ad67" (UID: "40b94c98-0561-4135-a5af-023ef5f4ad67"). InnerVolumeSpecName "kube-api-access-hbvff". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.834152 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbvff\" (UniqueName: \"kubernetes.io/projected/40b94c98-0561-4135-a5af-023ef5f4ad67-kube-api-access-hbvff\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.834198 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40b94c98-0561-4135-a5af-023ef5f4ad67-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.876828 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xg8wq" event={"ID":"40b94c98-0561-4135-a5af-023ef5f4ad67","Type":"ContainerDied","Data":"7681989d41c3df63a9cfe16c457a7c04de933a5c485b9a6a131f7473a305fd74"} Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.876867 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7681989d41c3df63a9cfe16c457a7c04de933a5c485b9a6a131f7473a305fd74" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.876918 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xg8wq" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.884693 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s7294" event={"ID":"ebccb3a0-d421-4c30-9201-43e9106e4006","Type":"ContainerDied","Data":"b1f9445ba0ed2622eaf729acf0f6efe1278fbfe9cc96bab1babb0686d7460824"} Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.884749 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1f9445ba0ed2622eaf729acf0f6efe1278fbfe9cc96bab1babb0686d7460824" Jan 29 17:05:04 crc kubenswrapper[4886]: I0129 17:05:04.884934 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s7294" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.248563 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.297597 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="2b0be43b-8956-45aa-ad50-de9183b3fea3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.144:5671: connect: connection refused" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.356789 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-operator-scripts\") pod \"d8a69a79-4e4c-4815-8cf5-0864ff2b8026\" (UID: \"d8a69a79-4e4c-4815-8cf5-0864ff2b8026\") " Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.356998 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gng45\" (UniqueName: \"kubernetes.io/projected/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-kube-api-access-gng45\") pod \"d8a69a79-4e4c-4815-8cf5-0864ff2b8026\" (UID: \"d8a69a79-4e4c-4815-8cf5-0864ff2b8026\") " Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.367226 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8a69a79-4e4c-4815-8cf5-0864ff2b8026" (UID: "d8a69a79-4e4c-4815-8cf5-0864ff2b8026"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.369268 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-kube-api-access-gng45" (OuterVolumeSpecName: "kube-api-access-gng45") pod "d8a69a79-4e4c-4815-8cf5-0864ff2b8026" (UID: "d8a69a79-4e4c-4815-8cf5-0864ff2b8026"). InnerVolumeSpecName "kube-api-access-gng45". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.432846 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.444409 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.146:5671: connect: connection refused" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.470077 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.470145 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gng45\" (UniqueName: \"kubernetes.io/projected/d8a69a79-4e4c-4815-8cf5-0864ff2b8026-kube-api-access-gng45\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.571184 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b24b5\" (UniqueName: \"kubernetes.io/projected/d2ed1f90-1318-483e-901c-bff80e1e94b6-kube-api-access-b24b5\") pod \"d2ed1f90-1318-483e-901c-bff80e1e94b6\" (UID: \"d2ed1f90-1318-483e-901c-bff80e1e94b6\") " Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.571473 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2ed1f90-1318-483e-901c-bff80e1e94b6-operator-scripts\") pod \"d2ed1f90-1318-483e-901c-bff80e1e94b6\" (UID: \"d2ed1f90-1318-483e-901c-bff80e1e94b6\") " Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.571898 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2ed1f90-1318-483e-901c-bff80e1e94b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2ed1f90-1318-483e-901c-bff80e1e94b6" (UID: "d2ed1f90-1318-483e-901c-bff80e1e94b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.576852 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ed1f90-1318-483e-901c-bff80e1e94b6-kube-api-access-b24b5" (OuterVolumeSpecName: "kube-api-access-b24b5") pod "d2ed1f90-1318-483e-901c-bff80e1e94b6" (UID: "d2ed1f90-1318-483e-901c-bff80e1e94b6"). InnerVolumeSpecName "kube-api-access-b24b5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.637421 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="842bfe4d-04ba-4143-9076-3033163c7b82" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.145:5671: connect: connection refused" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.674316 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2ed1f90-1318-483e-901c-bff80e1e94b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.674366 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b24b5\" (UniqueName: \"kubernetes.io/projected/d2ed1f90-1318-483e-901c-bff80e1e94b6-kube-api-access-b24b5\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.896263 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" event={"ID":"d8a69a79-4e4c-4815-8cf5-0864ff2b8026","Type":"ContainerDied","Data":"27e604caf8b15942348375a9990e1bf7c1fa6aa35968cf73fc93abd1ac9c4cad"} Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.896349 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27e604caf8b15942348375a9990e1bf7c1fa6aa35968cf73fc93abd1ac9c4cad" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.896367 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.900987 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.902731 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-23ad-account-create-update-2dsmj" event={"ID":"d2ed1f90-1318-483e-901c-bff80e1e94b6","Type":"ContainerDied","Data":"563f68afde711b3cca93a3f5d5dbae0e6aee5931cf6f7c5cb99463997cce21b1"} Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.902855 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="563f68afde711b3cca93a3f5d5dbae0e6aee5931cf6f7c5cb99463997cce21b1" Jan 29 17:05:05 crc kubenswrapper[4886]: I0129 17:05:05.970773 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 29 17:05:06 crc kubenswrapper[4886]: I0129 17:05:06.395519 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 17:05:06 crc kubenswrapper[4886]: I0129 17:05:06.910209 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="config-reloader" containerID="cri-o://36870feb46aff15218a1df0a6e9d4aa854998ebadaa74a5a50b3e39905ffbc8c" gracePeriod=600 Jan 29 17:05:06 crc kubenswrapper[4886]: I0129 17:05:06.910384 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="prometheus" containerID="cri-o://3a9c53d5227fb7b0c6bf2e7197762b1a4d147cab6dde0f951e7924a558b5e58d" gracePeriod=600 Jan 29 17:05:06 crc kubenswrapper[4886]: I0129 17:05:06.910385 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="thanos-sidecar" containerID="cri-o://29b6600206cc1bb7f3f16719ec90e5544c72d2eaf5a596eaa0dcf19be615c898" gracePeriod=600 Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.304960 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Jan 29 17:05:07 crc kubenswrapper[4886]: E0129 17:05:07.305483 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebccb3a0-d421-4c30-9201-43e9106e4006" containerName="swift-ring-rebalance" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.305503 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebccb3a0-d421-4c30-9201-43e9106e4006" containerName="swift-ring-rebalance" Jan 29 17:05:07 crc kubenswrapper[4886]: E0129 17:05:07.305519 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a69a79-4e4c-4815-8cf5-0864ff2b8026" containerName="mariadb-database-create" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.305525 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a69a79-4e4c-4815-8cf5-0864ff2b8026" containerName="mariadb-database-create" Jan 29 17:05:07 crc kubenswrapper[4886]: E0129 17:05:07.305547 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ed1f90-1318-483e-901c-bff80e1e94b6" containerName="mariadb-account-create-update" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.305555 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ed1f90-1318-483e-901c-bff80e1e94b6" containerName="mariadb-account-create-update" Jan 29 17:05:07 crc kubenswrapper[4886]: E0129 17:05:07.305576 4886 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="40b94c98-0561-4135-a5af-023ef5f4ad67" containerName="mariadb-account-create-update" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.305582 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b94c98-0561-4135-a5af-023ef5f4ad67" containerName="mariadb-account-create-update" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.305817 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebccb3a0-d421-4c30-9201-43e9106e4006" containerName="swift-ring-rebalance" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.305838 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b94c98-0561-4135-a5af-023ef5f4ad67" containerName="mariadb-account-create-update" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.305853 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a69a79-4e4c-4815-8cf5-0864ff2b8026" containerName="mariadb-database-create" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.305871 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ed1f90-1318-483e-901c-bff80e1e94b6" containerName="mariadb-account-create-update" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.306744 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.309936 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.317779 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.415727 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v52w\" (UniqueName: \"kubernetes.io/projected/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-kube-api-access-5v52w\") pod \"mysqld-exporter-0\" (UID: \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " pod="openstack/mysqld-exporter-0" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.416261 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " pod="openstack/mysqld-exporter-0" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.416535 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-config-data\") pod \"mysqld-exporter-0\" (UID: \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " pod="openstack/mysqld-exporter-0" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.463530 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.154:9090/-/ready\": dial tcp 10.217.0.154:9090: connect: connection refused" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.518805 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v52w\" (UniqueName: \"kubernetes.io/projected/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-kube-api-access-5v52w\") pod \"mysqld-exporter-0\" (UID: 
\"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " pod="openstack/mysqld-exporter-0" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.518936 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " pod="openstack/mysqld-exporter-0" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.519023 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-config-data\") pod \"mysqld-exporter-0\" (UID: \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " pod="openstack/mysqld-exporter-0" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.526493 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-config-data\") pod \"mysqld-exporter-0\" (UID: \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " pod="openstack/mysqld-exporter-0" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.526511 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " pod="openstack/mysqld-exporter-0" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.537675 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v52w\" (UniqueName: \"kubernetes.io/projected/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-kube-api-access-5v52w\") pod \"mysqld-exporter-0\" (UID: \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " pod="openstack/mysqld-exporter-0" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.615850 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:05:07 crc kubenswrapper[4886]: E0129 17:05:07.616238 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.672203 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.926412 4886 generic.go:334] "Generic (PLEG): container finished" podID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerID="29b6600206cc1bb7f3f16719ec90e5544c72d2eaf5a596eaa0dcf19be615c898" exitCode=0 Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.926837 4886 generic.go:334] "Generic (PLEG): container finished" podID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerID="3a9c53d5227fb7b0c6bf2e7197762b1a4d147cab6dde0f951e7924a558b5e58d" exitCode=0 Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.926848 4886 generic.go:334] "Generic (PLEG): container finished" podID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerID="36870feb46aff15218a1df0a6e9d4aa854998ebadaa74a5a50b3e39905ffbc8c" exitCode=0 Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.926872 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce7955a1-eb58-425a-872a-7ec102b8e090","Type":"ContainerDied","Data":"29b6600206cc1bb7f3f16719ec90e5544c72d2eaf5a596eaa0dcf19be615c898"} Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.926921 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce7955a1-eb58-425a-872a-7ec102b8e090","Type":"ContainerDied","Data":"3a9c53d5227fb7b0c6bf2e7197762b1a4d147cab6dde0f951e7924a558b5e58d"} Jan 29 17:05:07 crc kubenswrapper[4886]: I0129 17:05:07.926938 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce7955a1-eb58-425a-872a-7ec102b8e090","Type":"ContainerDied","Data":"36870feb46aff15218a1df0a6e9d4aa854998ebadaa74a5a50b3e39905ffbc8c"} Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.190239 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.662076 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.750739 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0\") pod \"ce7955a1-eb58-425a-872a-7ec102b8e090\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.750801 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce7955a1-eb58-425a-872a-7ec102b8e090-config-out\") pod \"ce7955a1-eb58-425a-872a-7ec102b8e090\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.750871 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-2\") pod \"ce7955a1-eb58-425a-872a-7ec102b8e090\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.750904 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-0\") pod \"ce7955a1-eb58-425a-872a-7ec102b8e090\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.750956 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-config\") pod \"ce7955a1-eb58-425a-872a-7ec102b8e090\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.750985 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-tls-assets\") pod \"ce7955a1-eb58-425a-872a-7ec102b8e090\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.751000 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-1\") pod \"ce7955a1-eb58-425a-872a-7ec102b8e090\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.751039 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-web-config\") pod \"ce7955a1-eb58-425a-872a-7ec102b8e090\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.751189 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2cnt\" (UniqueName: \"kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-kube-api-access-w2cnt\") pod \"ce7955a1-eb58-425a-872a-7ec102b8e090\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.751283 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-thanos-prometheus-http-client-file\") pod \"ce7955a1-eb58-425a-872a-7ec102b8e090\" (UID: \"ce7955a1-eb58-425a-872a-7ec102b8e090\") " Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.752163 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "ce7955a1-eb58-425a-872a-7ec102b8e090" (UID: "ce7955a1-eb58-425a-872a-7ec102b8e090"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.752175 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "ce7955a1-eb58-425a-872a-7ec102b8e090" (UID: "ce7955a1-eb58-425a-872a-7ec102b8e090"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.753449 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "ce7955a1-eb58-425a-872a-7ec102b8e090" (UID: "ce7955a1-eb58-425a-872a-7ec102b8e090"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.756854 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce7955a1-eb58-425a-872a-7ec102b8e090-config-out" (OuterVolumeSpecName: "config-out") pod "ce7955a1-eb58-425a-872a-7ec102b8e090" (UID: "ce7955a1-eb58-425a-872a-7ec102b8e090"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.757671 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ce7955a1-eb58-425a-872a-7ec102b8e090" (UID: "ce7955a1-eb58-425a-872a-7ec102b8e090"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.760273 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "ce7955a1-eb58-425a-872a-7ec102b8e090" (UID: "ce7955a1-eb58-425a-872a-7ec102b8e090"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.774193 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-kube-api-access-w2cnt" (OuterVolumeSpecName: "kube-api-access-w2cnt") pod "ce7955a1-eb58-425a-872a-7ec102b8e090" (UID: "ce7955a1-eb58-425a-872a-7ec102b8e090"). InnerVolumeSpecName "kube-api-access-w2cnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.797418 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-web-config" (OuterVolumeSpecName: "web-config") pod "ce7955a1-eb58-425a-872a-7ec102b8e090" (UID: "ce7955a1-eb58-425a-872a-7ec102b8e090"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.799129 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-config" (OuterVolumeSpecName: "config") pod "ce7955a1-eb58-425a-872a-7ec102b8e090" (UID: "ce7955a1-eb58-425a-872a-7ec102b8e090"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.805893 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "ce7955a1-eb58-425a-872a-7ec102b8e090" (UID: "ce7955a1-eb58-425a-872a-7ec102b8e090"). InnerVolumeSpecName "pvc-68e86941-9560-4703-a0e6-50bee25f62a0". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.853755 4886 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-web-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.853787 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2cnt\" (UniqueName: \"kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-kube-api-access-w2cnt\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.853799 4886 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.853853 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-68e86941-9560-4703-a0e6-50bee25f62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0\") on node \"crc\" " Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.853864 4886 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce7955a1-eb58-425a-872a-7ec102b8e090-config-out\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.853874 4886 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.853884 4886 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.853894 4886 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/ce7955a1-eb58-425a-872a-7ec102b8e090-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.853902 4886 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce7955a1-eb58-425a-872a-7ec102b8e090-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.853910 4886 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ce7955a1-eb58-425a-872a-7ec102b8e090-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.880853 4886 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.884068 4886 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-68e86941-9560-4703-a0e6-50bee25f62a0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0") on node "crc" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.940771 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e","Type":"ContainerStarted","Data":"a4b442eb660a759ea9b06148625ca4e079373c7e47cea96d0478208100ae22a9"} Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.947841 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce7955a1-eb58-425a-872a-7ec102b8e090","Type":"ContainerDied","Data":"38705f04f0f2e20b7f5d72009f437278994e72d7c6d255707ef36ddaf6f80953"} Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.947899 4886 scope.go:117] "RemoveContainer" containerID="29b6600206cc1bb7f3f16719ec90e5544c72d2eaf5a596eaa0dcf19be615c898" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.948062 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.964116 4886 reconciler_common.go:293] "Volume detached for volume \"pvc-68e86941-9560-4703-a0e6-50bee25f62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:08 crc kubenswrapper[4886]: I0129 17:05:08.991081 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.015546 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.015810 4886 scope.go:117] "RemoveContainer" containerID="3a9c53d5227fb7b0c6bf2e7197762b1a4d147cab6dde0f951e7924a558b5e58d" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.059402 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 17:05:09 crc kubenswrapper[4886]: E0129 17:05:09.060017 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="thanos-sidecar" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.060037 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="thanos-sidecar" Jan 29 17:05:09 crc kubenswrapper[4886]: E0129 17:05:09.060075 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="config-reloader" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.060083 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="config-reloader" Jan 29 17:05:09 crc kubenswrapper[4886]: E0129 17:05:09.060169 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="init-config-reloader" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.060180 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="init-config-reloader" Jan 29 17:05:09 crc kubenswrapper[4886]: E0129 17:05:09.060199 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="prometheus" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.060206 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="prometheus" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.060408 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="config-reloader" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.060428 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="thanos-sidecar" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.060438 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" containerName="prometheus" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.062476 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.066985 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-gbmnx" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.070606 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.071184 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.071351 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.071706 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.071828 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.071981 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.072077 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.072140 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.076672 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.086805 4886 scope.go:117] "RemoveContainer" containerID="36870feb46aff15218a1df0a6e9d4aa854998ebadaa74a5a50b3e39905ffbc8c" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.155717 4886 scope.go:117] "RemoveContainer" containerID="583c2c73cc1b55ad9f4f022652302dc10ae77e94e45a693b0865ff8b717978ab" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.168914 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.169011 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.169060 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-68e86941-9560-4703-a0e6-50bee25f62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0\") pod 
\"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.169105 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.169133 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-config\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.169178 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.169297 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.169348 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.169463 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m24b4\" (UniqueName: \"kubernetes.io/projected/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-kube-api-access-m24b4\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.169496 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.169528 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 
17:05:09.169640 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.169732 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272110 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272168 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-68e86941-9560-4703-a0e6-50bee25f62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272207 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272226 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-config\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272260 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272318 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272354 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272393 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m24b4\" (UniqueName: \"kubernetes.io/projected/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-kube-api-access-m24b4\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272413 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272438 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272472 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272490 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.272522 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.273298 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.276835 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-config\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " 
pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.276843 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.277551 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.277667 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.277935 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.277956 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-68e86941-9560-4703-a0e6-50bee25f62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5b5b0b1c62be5d324bfe10f676e08a70a611b72b2c99a9227275ea9ec17aa7e0/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.279381 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.282402 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.283066 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.284696 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" 
(UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.287764 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.290055 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m24b4\" (UniqueName: \"kubernetes.io/projected/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-kube-api-access-m24b4\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.293565 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8b3a2d6b-4eb5-44a2-837b-cfbe63f07107-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.315911 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-68e86941-9560-4703-a0e6-50bee25f62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68e86941-9560-4703-a0e6-50bee25f62a0\") pod \"prometheus-metric-storage-0\" (UID: \"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107\") " pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:09 crc kubenswrapper[4886]: I0129 17:05:09.407423 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:10 crc kubenswrapper[4886]: I0129 17:05:10.631853 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce7955a1-eb58-425a-872a-7ec102b8e090" path="/var/lib/kubelet/pods/ce7955a1-eb58-425a-872a-7ec102b8e090/volumes" Jan 29 17:05:10 crc kubenswrapper[4886]: I0129 17:05:10.771174 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 17:05:15 crc kubenswrapper[4886]: I0129 17:05:15.298595 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="2b0be43b-8956-45aa-ad50-de9183b3fea3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.144:5671: connect: connection refused" Jan 29 17:05:15 crc kubenswrapper[4886]: I0129 17:05:15.444870 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.146:5671: connect: connection refused" Jan 29 17:05:15 crc kubenswrapper[4886]: I0129 17:05:15.638085 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Jan 29 17:05:16 crc kubenswrapper[4886]: I0129 17:05:16.329704 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:05:16 crc kubenswrapper[4886]: I0129 17:05:16.336361 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e2f2c6c-bc32-4a32-ba2c-8954d277ce47-etc-swift\") pod \"swift-storage-0\" (UID: \"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47\") " pod="openstack/swift-storage-0" Jan 29 17:05:16 crc kubenswrapper[4886]: I0129 17:05:16.475426 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 29 17:05:17 crc kubenswrapper[4886]: I0129 17:05:17.041363 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107","Type":"ContainerStarted","Data":"008507624c5e459bffcfe3745d6841d3a84f99cc269885679b7c9e83134281c5"} Jan 29 17:05:19 crc kubenswrapper[4886]: I0129 17:05:19.519721 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 29 17:05:19 crc kubenswrapper[4886]: W0129 17:05:19.524199 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e2f2c6c_bc32_4a32_ba2c_8954d277ce47.slice/crio-251f09491fb6f3211881c60459fd725a28b397882e9bf117072fb4445ff00e03 WatchSource:0}: Error finding container 251f09491fb6f3211881c60459fd725a28b397882e9bf117072fb4445ff00e03: Status 404 returned error can't find the container with id 251f09491fb6f3211881c60459fd725a28b397882e9bf117072fb4445ff00e03 Jan 29 17:05:20 crc kubenswrapper[4886]: I0129 17:05:20.076071 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-thqn5" event={"ID":"9f114908-5594-4378-939f-f54b2157d676","Type":"ContainerStarted","Data":"76e9fd9551f88713599d793f819bec47fc38185510d47fbd152e0939943ac037"} Jan 29 17:05:20 crc kubenswrapper[4886]: I0129 17:05:20.077796 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e","Type":"ContainerStarted","Data":"2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66"} Jan 29 17:05:20 crc kubenswrapper[4886]: I0129 17:05:20.080282 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"251f09491fb6f3211881c60459fd725a28b397882e9bf117072fb4445ff00e03"} Jan 29 17:05:20 crc kubenswrapper[4886]: I0129 17:05:20.103604 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-thqn5" podStartSLOduration=2.888788242 podStartE2EDuration="21.103587655s" podCreationTimestamp="2026-01-29 17:04:59 +0000 UTC" firstStartedPulling="2026-01-29 17:05:00.742956749 +0000 UTC m=+2583.651676021" lastFinishedPulling="2026-01-29 17:05:18.957756122 +0000 UTC m=+2601.866475434" observedRunningTime="2026-01-29 17:05:20.095774455 +0000 UTC m=+2603.004493737" watchObservedRunningTime="2026-01-29 17:05:20.103587655 +0000 UTC m=+2603.012306927" Jan 29 17:05:20 crc kubenswrapper[4886]: I0129 17:05:20.124845 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.391488973 podStartE2EDuration="13.124807173s" podCreationTimestamp="2026-01-29 17:05:07 +0000 UTC" firstStartedPulling="2026-01-29 17:05:08.224426951 +0000 UTC m=+2591.133146223" lastFinishedPulling="2026-01-29 17:05:18.957745131 +0000 UTC m=+2601.866464423" observedRunningTime="2026-01-29 17:05:20.114821381 +0000 UTC m=+2603.023540663" watchObservedRunningTime="2026-01-29 17:05:20.124807173 +0000 UTC m=+2603.033526495" Jan 29 17:05:22 crc kubenswrapper[4886]: I0129 17:05:22.615166 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:05:22 crc kubenswrapper[4886]: E0129 17:05:22.615893 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:05:23 crc kubenswrapper[4886]: I0129 17:05:23.116684 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107","Type":"ContainerStarted","Data":"de7dada2ef19babe3f5199b8971a1952c603cdf7fc481479b9ab0e7054f6362b"} Jan 29 17:05:25 crc kubenswrapper[4886]: I0129 17:05:25.299648 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 29 17:05:25 crc kubenswrapper[4886]: I0129 17:05:25.444234 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.146:5671: connect: connection refused" Jan 29 17:05:26 crc kubenswrapper[4886]: I0129 17:05:26.166182 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"91dfedcd84ac1fcfc4233d9c608ed66798ca8b2cc395de0a7cfa1a84b6ad0b93"} Jan 29 17:05:27 crc kubenswrapper[4886]: I0129 17:05:27.180433 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"210093098e70027ca0511a925eb7d3f4d788705183245a1fed785c07c0db8d0c"} Jan 29 17:05:28 crc kubenswrapper[4886]: I0129 17:05:28.198694 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"675fd086dcbf76788ef35301c27eec88d51392bbe6d2527c9b8247b18b6bedc8"} Jan 29 17:05:32 crc kubenswrapper[4886]: I0129 17:05:32.237297 4886 generic.go:334] "Generic (PLEG): container finished" podID="8b3a2d6b-4eb5-44a2-837b-cfbe63f07107" containerID="de7dada2ef19babe3f5199b8971a1952c603cdf7fc481479b9ab0e7054f6362b" exitCode=0 Jan 29 17:05:32 crc kubenswrapper[4886]: I0129 17:05:32.237363 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107","Type":"ContainerDied","Data":"de7dada2ef19babe3f5199b8971a1952c603cdf7fc481479b9ab0e7054f6362b"} Jan 29 17:05:32 crc kubenswrapper[4886]: I0129 17:05:32.241363 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"4bf82f389eecacebff2da62d86dc9ced9849a658b1bb5c3ad10e05ed2b182877"} Jan 29 17:05:33 crc kubenswrapper[4886]: I0129 17:05:33.266726 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107","Type":"ContainerStarted","Data":"04d25e6f1ec09cdc59b613cf64cd249765d99f850dfac14010aa9c2703547555"} Jan 29 17:05:34 crc kubenswrapper[4886]: I0129 17:05:34.615877 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:05:35 crc kubenswrapper[4886]: I0129 17:05:35.288905 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"db3893b2fd9096a13f5744612d4a2bcbba80c7ed2ddb6ffa1307348c351b1963"} Jan 29 17:05:35 crc kubenswrapper[4886]: I0129 17:05:35.448568 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Jan 29 17:05:35 crc kubenswrapper[4886]: I0129 17:05:35.978760 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-b8qfq"] Jan 29 17:05:35 crc kubenswrapper[4886]: I0129 17:05:35.980487 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-b8qfq" Jan 29 17:05:35 crc kubenswrapper[4886]: I0129 17:05:35.987705 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-b8qfq"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.078119 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrmtg\" (UniqueName: \"kubernetes.io/projected/219e979e-b3a8-42d0-8f23-737a86a2aefb-kube-api-access-qrmtg\") pod \"heat-db-create-b8qfq\" (UID: \"219e979e-b3a8-42d0-8f23-737a86a2aefb\") " pod="openstack/heat-db-create-b8qfq" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.078281 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/219e979e-b3a8-42d0-8f23-737a86a2aefb-operator-scripts\") pod \"heat-db-create-b8qfq\" (UID: \"219e979e-b3a8-42d0-8f23-737a86a2aefb\") " pod="openstack/heat-db-create-b8qfq" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.091019 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-5m27f"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.092685 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5m27f" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.165198 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5m27f"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.180769 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrmtg\" (UniqueName: \"kubernetes.io/projected/219e979e-b3a8-42d0-8f23-737a86a2aefb-kube-api-access-qrmtg\") pod \"heat-db-create-b8qfq\" (UID: \"219e979e-b3a8-42d0-8f23-737a86a2aefb\") " pod="openstack/heat-db-create-b8qfq" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.180954 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/219e979e-b3a8-42d0-8f23-737a86a2aefb-operator-scripts\") pod \"heat-db-create-b8qfq\" (UID: \"219e979e-b3a8-42d0-8f23-737a86a2aefb\") " pod="openstack/heat-db-create-b8qfq" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.181018 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca25333-29b2-4c38-9e85-ebd2a0d593d6-operator-scripts\") pod \"cinder-db-create-5m27f\" (UID: \"eca25333-29b2-4c38-9e85-ebd2a0d593d6\") " pod="openstack/cinder-db-create-5m27f" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.181071 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8zm4\" (UniqueName: \"kubernetes.io/projected/eca25333-29b2-4c38-9e85-ebd2a0d593d6-kube-api-access-c8zm4\") pod \"cinder-db-create-5m27f\" (UID: \"eca25333-29b2-4c38-9e85-ebd2a0d593d6\") " pod="openstack/cinder-db-create-5m27f" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.182488 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/219e979e-b3a8-42d0-8f23-737a86a2aefb-operator-scripts\") pod \"heat-db-create-b8qfq\" (UID: \"219e979e-b3a8-42d0-8f23-737a86a2aefb\") " pod="openstack/heat-db-create-b8qfq" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.211080 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrmtg\" (UniqueName: \"kubernetes.io/projected/219e979e-b3a8-42d0-8f23-737a86a2aefb-kube-api-access-qrmtg\") pod \"heat-db-create-b8qfq\" (UID: \"219e979e-b3a8-42d0-8f23-737a86a2aefb\") " pod="openstack/heat-db-create-b8qfq" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.243168 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-bd38-account-create-update-rgmr5"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.251626 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-bd38-account-create-update-rgmr5" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.261611 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.268742 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bd38-account-create-update-rgmr5"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.283670 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca25333-29b2-4c38-9e85-ebd2a0d593d6-operator-scripts\") pod \"cinder-db-create-5m27f\" (UID: \"eca25333-29b2-4c38-9e85-ebd2a0d593d6\") " pod="openstack/cinder-db-create-5m27f" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.283765 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8zm4\" (UniqueName: \"kubernetes.io/projected/eca25333-29b2-4c38-9e85-ebd2a0d593d6-kube-api-access-c8zm4\") pod \"cinder-db-create-5m27f\" (UID: \"eca25333-29b2-4c38-9e85-ebd2a0d593d6\") " pod="openstack/cinder-db-create-5m27f" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.284549 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca25333-29b2-4c38-9e85-ebd2a0d593d6-operator-scripts\") pod \"cinder-db-create-5m27f\" (UID: \"eca25333-29b2-4c38-9e85-ebd2a0d593d6\") " pod="openstack/cinder-db-create-5m27f" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.298412 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107","Type":"ContainerStarted","Data":"1f127eef1f75d009bfa88d892080ac8076b5b396ae14658ea85d8c93fccd374f"} Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.315929 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-b8qfq" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.325761 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8zm4\" (UniqueName: \"kubernetes.io/projected/eca25333-29b2-4c38-9e85-ebd2a0d593d6-kube-api-access-c8zm4\") pod \"cinder-db-create-5m27f\" (UID: \"eca25333-29b2-4c38-9e85-ebd2a0d593d6\") " pod="openstack/cinder-db-create-5m27f" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.384915 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-vvrp4"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.386136 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-operator-scripts\") pod \"cinder-bd38-account-create-update-rgmr5\" (UID: \"c31fe7aa-0ad1-44ef-a748-b4f366a4d374\") " pod="openstack/cinder-bd38-account-create-update-rgmr5" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.386234 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lv2x\" (UniqueName: \"kubernetes.io/projected/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-kube-api-access-6lv2x\") pod \"cinder-bd38-account-create-update-rgmr5\" (UID: \"c31fe7aa-0ad1-44ef-a748-b4f366a4d374\") " pod="openstack/cinder-bd38-account-create-update-rgmr5" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.386287 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vvrp4" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.398028 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-70c1-account-create-update-gwzzv"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.399644 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-70c1-account-create-update-gwzzv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.402368 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.413402 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5m27f" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.450025 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vvrp4"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.466993 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-70c1-account-create-update-gwzzv"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.488027 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhtgf\" (UniqueName: \"kubernetes.io/projected/2b3dc785-5f55-49ca-8678-5105ba7e0568-kube-api-access-lhtgf\") pod \"barbican-70c1-account-create-update-gwzzv\" (UID: \"2b3dc785-5f55-49ca-8678-5105ba7e0568\") " pod="openstack/barbican-70c1-account-create-update-gwzzv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.488108 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-operator-scripts\") pod \"cinder-bd38-account-create-update-rgmr5\" (UID: \"c31fe7aa-0ad1-44ef-a748-b4f366a4d374\") " pod="openstack/cinder-bd38-account-create-update-rgmr5" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.488164 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61eedb40-ed14-42aa-9751-8bedcd699260-operator-scripts\") pod \"barbican-db-create-vvrp4\" (UID: \"61eedb40-ed14-42aa-9751-8bedcd699260\") " pod="openstack/barbican-db-create-vvrp4" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.488188 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b3dc785-5f55-49ca-8678-5105ba7e0568-operator-scripts\") pod \"barbican-70c1-account-create-update-gwzzv\" (UID: \"2b3dc785-5f55-49ca-8678-5105ba7e0568\") " pod="openstack/barbican-70c1-account-create-update-gwzzv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.488233 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lv2x\" (UniqueName: \"kubernetes.io/projected/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-kube-api-access-6lv2x\") pod \"cinder-bd38-account-create-update-rgmr5\" (UID: \"c31fe7aa-0ad1-44ef-a748-b4f366a4d374\") " pod="openstack/cinder-bd38-account-create-update-rgmr5" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.488261 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8fng\" (UniqueName: \"kubernetes.io/projected/61eedb40-ed14-42aa-9751-8bedcd699260-kube-api-access-r8fng\") pod \"barbican-db-create-vvrp4\" (UID: \"61eedb40-ed14-42aa-9751-8bedcd699260\") " pod="openstack/barbican-db-create-vvrp4" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.488916 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-operator-scripts\") pod \"cinder-bd38-account-create-update-rgmr5\" (UID: \"c31fe7aa-0ad1-44ef-a748-b4f366a4d374\") " pod="openstack/cinder-bd38-account-create-update-rgmr5" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.504467 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8whvl"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 
17:05:36.505801 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8whvl" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.511056 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.511508 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5qcd" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.511583 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.511883 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.538052 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lv2x\" (UniqueName: \"kubernetes.io/projected/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-kube-api-access-6lv2x\") pod \"cinder-bd38-account-create-update-rgmr5\" (UID: \"c31fe7aa-0ad1-44ef-a748-b4f366a4d374\") " pod="openstack/cinder-bd38-account-create-update-rgmr5" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.543401 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-4501-account-create-update-hj72z"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.545073 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4501-account-create-update-hj72z" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.563160 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.563843 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8whvl"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.574359 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4501-account-create-update-hj72z"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.589822 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxrkh\" (UniqueName: \"kubernetes.io/projected/6c9729b7-e21b-4509-b337-618094fb2d52-kube-api-access-gxrkh\") pod \"keystone-db-sync-8whvl\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " pod="openstack/keystone-db-sync-8whvl" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.590141 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhtgf\" (UniqueName: \"kubernetes.io/projected/2b3dc785-5f55-49ca-8678-5105ba7e0568-kube-api-access-lhtgf\") pod \"barbican-70c1-account-create-update-gwzzv\" (UID: \"2b3dc785-5f55-49ca-8678-5105ba7e0568\") " pod="openstack/barbican-70c1-account-create-update-gwzzv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.590191 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-combined-ca-bundle\") pod \"keystone-db-sync-8whvl\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " pod="openstack/keystone-db-sync-8whvl" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.590385 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-config-data\") pod \"keystone-db-sync-8whvl\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " pod="openstack/keystone-db-sync-8whvl" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.590457 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61eedb40-ed14-42aa-9751-8bedcd699260-operator-scripts\") pod \"barbican-db-create-vvrp4\" (UID: \"61eedb40-ed14-42aa-9751-8bedcd699260\") " pod="openstack/barbican-db-create-vvrp4" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.590491 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b3dc785-5f55-49ca-8678-5105ba7e0568-operator-scripts\") pod \"barbican-70c1-account-create-update-gwzzv\" (UID: \"2b3dc785-5f55-49ca-8678-5105ba7e0568\") " pod="openstack/barbican-70c1-account-create-update-gwzzv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.590666 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8fng\" (UniqueName: \"kubernetes.io/projected/61eedb40-ed14-42aa-9751-8bedcd699260-kube-api-access-r8fng\") pod \"barbican-db-create-vvrp4\" (UID: \"61eedb40-ed14-42aa-9751-8bedcd699260\") " pod="openstack/barbican-db-create-vvrp4" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.598005 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61eedb40-ed14-42aa-9751-8bedcd699260-operator-scripts\") pod \"barbican-db-create-vvrp4\" (UID: \"61eedb40-ed14-42aa-9751-8bedcd699260\") " pod="openstack/barbican-db-create-vvrp4" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.598963 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b3dc785-5f55-49ca-8678-5105ba7e0568-operator-scripts\") pod \"barbican-70c1-account-create-update-gwzzv\" (UID: \"2b3dc785-5f55-49ca-8678-5105ba7e0568\") " pod="openstack/barbican-70c1-account-create-update-gwzzv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.627803 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhtgf\" (UniqueName: \"kubernetes.io/projected/2b3dc785-5f55-49ca-8678-5105ba7e0568-kube-api-access-lhtgf\") pod \"barbican-70c1-account-create-update-gwzzv\" (UID: \"2b3dc785-5f55-49ca-8678-5105ba7e0568\") " pod="openstack/barbican-70c1-account-create-update-gwzzv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.663068 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8fng\" (UniqueName: \"kubernetes.io/projected/61eedb40-ed14-42aa-9751-8bedcd699260-kube-api-access-r8fng\") pod \"barbican-db-create-vvrp4\" (UID: \"61eedb40-ed14-42aa-9751-8bedcd699260\") " pod="openstack/barbican-db-create-vvrp4" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.671537 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-mj8rv"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.672722 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mj8rv"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.674746 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mj8rv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.687089 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bd38-account-create-update-rgmr5" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.692428 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-config-data\") pod \"keystone-db-sync-8whvl\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " pod="openstack/keystone-db-sync-8whvl" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.692562 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxrkh\" (UniqueName: \"kubernetes.io/projected/6c9729b7-e21b-4509-b337-618094fb2d52-kube-api-access-gxrkh\") pod \"keystone-db-sync-8whvl\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " pod="openstack/keystone-db-sync-8whvl" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.692605 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg89r\" (UniqueName: \"kubernetes.io/projected/95df3f15-8d1d-4baf-bbb6-df4939f0d201-kube-api-access-rg89r\") pod \"heat-4501-account-create-update-hj72z\" (UID: \"95df3f15-8d1d-4baf-bbb6-df4939f0d201\") " pod="openstack/heat-4501-account-create-update-hj72z" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.692645 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-combined-ca-bundle\") pod \"keystone-db-sync-8whvl\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " pod="openstack/keystone-db-sync-8whvl" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.692684 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95df3f15-8d1d-4baf-bbb6-df4939f0d201-operator-scripts\") pod \"heat-4501-account-create-update-hj72z\" (UID: \"95df3f15-8d1d-4baf-bbb6-df4939f0d201\") " pod="openstack/heat-4501-account-create-update-hj72z" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.722138 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vvrp4" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.757206 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-70c1-account-create-update-gwzzv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.776319 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-combined-ca-bundle\") pod \"keystone-db-sync-8whvl\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " pod="openstack/keystone-db-sync-8whvl" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.777310 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-config-data\") pod \"keystone-db-sync-8whvl\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " pod="openstack/keystone-db-sync-8whvl" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.782395 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxrkh\" (UniqueName: \"kubernetes.io/projected/6c9729b7-e21b-4509-b337-618094fb2d52-kube-api-access-gxrkh\") pod \"keystone-db-sync-8whvl\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " pod="openstack/keystone-db-sync-8whvl" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.794670 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg89r\" (UniqueName: \"kubernetes.io/projected/95df3f15-8d1d-4baf-bbb6-df4939f0d201-kube-api-access-rg89r\") pod \"heat-4501-account-create-update-hj72z\" (UID: \"95df3f15-8d1d-4baf-bbb6-df4939f0d201\") " pod="openstack/heat-4501-account-create-update-hj72z" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.794778 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34bb765-0998-45ea-bb61-9fbbc2c7359d-operator-scripts\") pod \"neutron-db-create-mj8rv\" (UID: \"f34bb765-0998-45ea-bb61-9fbbc2c7359d\") " pod="openstack/neutron-db-create-mj8rv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.794864 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95df3f15-8d1d-4baf-bbb6-df4939f0d201-operator-scripts\") pod \"heat-4501-account-create-update-hj72z\" (UID: \"95df3f15-8d1d-4baf-bbb6-df4939f0d201\") " pod="openstack/heat-4501-account-create-update-hj72z" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.794962 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9f7t\" (UniqueName: \"kubernetes.io/projected/f34bb765-0998-45ea-bb61-9fbbc2c7359d-kube-api-access-r9f7t\") pod \"neutron-db-create-mj8rv\" (UID: \"f34bb765-0998-45ea-bb61-9fbbc2c7359d\") " pod="openstack/neutron-db-create-mj8rv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.795866 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95df3f15-8d1d-4baf-bbb6-df4939f0d201-operator-scripts\") pod \"heat-4501-account-create-update-hj72z\" (UID: \"95df3f15-8d1d-4baf-bbb6-df4939f0d201\") " pod="openstack/heat-4501-account-create-update-hj72z" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.805530 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e433-account-create-update-qm5sx"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.806874 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e433-account-create-update-qm5sx" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.810830 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.811210 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg89r\" (UniqueName: \"kubernetes.io/projected/95df3f15-8d1d-4baf-bbb6-df4939f0d201-kube-api-access-rg89r\") pod \"heat-4501-account-create-update-hj72z\" (UID: \"95df3f15-8d1d-4baf-bbb6-df4939f0d201\") " pod="openstack/heat-4501-account-create-update-hj72z" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.827514 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e433-account-create-update-qm5sx"] Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.897501 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34bb765-0998-45ea-bb61-9fbbc2c7359d-operator-scripts\") pod \"neutron-db-create-mj8rv\" (UID: \"f34bb765-0998-45ea-bb61-9fbbc2c7359d\") " pod="openstack/neutron-db-create-mj8rv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.897550 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kv5p\" (UniqueName: \"kubernetes.io/projected/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-kube-api-access-7kv5p\") pod \"neutron-e433-account-create-update-qm5sx\" (UID: \"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff\") " pod="openstack/neutron-e433-account-create-update-qm5sx" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.897641 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9f7t\" (UniqueName: \"kubernetes.io/projected/f34bb765-0998-45ea-bb61-9fbbc2c7359d-kube-api-access-r9f7t\") pod \"neutron-db-create-mj8rv\" (UID: \"f34bb765-0998-45ea-bb61-9fbbc2c7359d\") " pod="openstack/neutron-db-create-mj8rv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.897663 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-operator-scripts\") pod \"neutron-e433-account-create-update-qm5sx\" (UID: \"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff\") " pod="openstack/neutron-e433-account-create-update-qm5sx" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.898604 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34bb765-0998-45ea-bb61-9fbbc2c7359d-operator-scripts\") pod \"neutron-db-create-mj8rv\" (UID: \"f34bb765-0998-45ea-bb61-9fbbc2c7359d\") " pod="openstack/neutron-db-create-mj8rv" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.903693 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8whvl" Jan 29 17:05:36 crc kubenswrapper[4886]: I0129 17:05:36.918226 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9f7t\" (UniqueName: \"kubernetes.io/projected/f34bb765-0998-45ea-bb61-9fbbc2c7359d-kube-api-access-r9f7t\") pod \"neutron-db-create-mj8rv\" (UID: \"f34bb765-0998-45ea-bb61-9fbbc2c7359d\") " pod="openstack/neutron-db-create-mj8rv" Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:36.999741 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kv5p\" (UniqueName: \"kubernetes.io/projected/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-kube-api-access-7kv5p\") pod \"neutron-e433-account-create-update-qm5sx\" (UID: \"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff\") " pod="openstack/neutron-e433-account-create-update-qm5sx" Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:36.999882 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-operator-scripts\") pod \"neutron-e433-account-create-update-qm5sx\" (UID: \"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff\") " pod="openstack/neutron-e433-account-create-update-qm5sx" Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:37.000846 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-operator-scripts\") pod \"neutron-e433-account-create-update-qm5sx\" (UID: \"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff\") " pod="openstack/neutron-e433-account-create-update-qm5sx" Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:37.028344 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4501-account-create-update-hj72z" Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:37.049090 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mj8rv" Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:37.071555 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kv5p\" (UniqueName: \"kubernetes.io/projected/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-kube-api-access-7kv5p\") pod \"neutron-e433-account-create-update-qm5sx\" (UID: \"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff\") " pod="openstack/neutron-e433-account-create-update-qm5sx" Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:37.157708 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e433-account-create-update-qm5sx" Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:37.385230 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-b8qfq"] Jan 29 17:05:37 crc kubenswrapper[4886]: W0129 17:05:37.432602 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod219e979e_b3a8_42d0_8f23_737a86a2aefb.slice/crio-4406b94675c6c7ae9446195f8dfab310f4fa8a3adf586cc31ec4c425aaec53ea WatchSource:0}: Error finding container 4406b94675c6c7ae9446195f8dfab310f4fa8a3adf586cc31ec4c425aaec53ea: Status 404 returned error can't find the container with id 4406b94675c6c7ae9446195f8dfab310f4fa8a3adf586cc31ec4c425aaec53ea Jan 29 17:05:37 crc kubenswrapper[4886]: E0129 17:05:37.455140 4886 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.174:39510->38.129.56.174:35269: write tcp 38.129.56.174:39510->38.129.56.174:35269: write: broken pipe Jan 29 17:05:37 crc kubenswrapper[4886]: W0129 17:05:37.821396 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeca25333_29b2_4c38_9e85_ebd2a0d593d6.slice/crio-cd4898dfd3366424ff76daf2236da5aa1109f2d2ee7053756e696c5c71f74315 WatchSource:0}: Error finding container cd4898dfd3366424ff76daf2236da5aa1109f2d2ee7053756e696c5c71f74315: Status 404 returned error can't find the container with id cd4898dfd3366424ff76daf2236da5aa1109f2d2ee7053756e696c5c71f74315 Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:37.821846 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5m27f"] Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:37.866155 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vvrp4"] Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:37.907355 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bd38-account-create-update-rgmr5"] Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:37.935362 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-70c1-account-create-update-gwzzv"] Jan 29 17:05:37 crc kubenswrapper[4886]: W0129 17:05:37.941066 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc31fe7aa_0ad1_44ef_a748_b4f366a4d374.slice/crio-5d2dfc86002d797af59c9cb682ec219bf20ee62338a9f69385af929e1e8a81cc WatchSource:0}: Error finding container 5d2dfc86002d797af59c9cb682ec219bf20ee62338a9f69385af929e1e8a81cc: Status 404 returned error can't find the container with id 5d2dfc86002d797af59c9cb682ec219bf20ee62338a9f69385af929e1e8a81cc Jan 29 17:05:37 crc kubenswrapper[4886]: W0129 17:05:37.942208 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b3dc785_5f55_49ca_8678_5105ba7e0568.slice/crio-723376c3c9f49ffb2963a000b3bd3332b032ec0a620314db2f5d4affe87fe53d WatchSource:0}: Error finding container 723376c3c9f49ffb2963a000b3bd3332b032ec0a620314db2f5d4affe87fe53d: Status 404 returned error can't find the container with id 723376c3c9f49ffb2963a000b3bd3332b032ec0a620314db2f5d4affe87fe53d Jan 29 17:05:37 crc kubenswrapper[4886]: I0129 17:05:37.954822 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8whvl"] Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.221054 
4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mj8rv"] Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.240372 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4501-account-create-update-hj72z"] Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.300031 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e433-account-create-update-qm5sx"] Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.344517 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vvrp4" event={"ID":"61eedb40-ed14-42aa-9751-8bedcd699260","Type":"ContainerStarted","Data":"9fec24589ec3e892ddf58d22ea6ebcc076444b7d5a5a5f362446314614208572"} Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.350543 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bd38-account-create-update-rgmr5" event={"ID":"c31fe7aa-0ad1-44ef-a748-b4f366a4d374","Type":"ContainerStarted","Data":"5d2dfc86002d797af59c9cb682ec219bf20ee62338a9f69385af929e1e8a81cc"} Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.362659 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8b3a2d6b-4eb5-44a2-837b-cfbe63f07107","Type":"ContainerStarted","Data":"cac4502f21828cb5ae9e53c7348f0195d428843a6c621cdd4045a212fbc7700c"} Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.368305 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-b8qfq" event={"ID":"219e979e-b3a8-42d0-8f23-737a86a2aefb","Type":"ContainerStarted","Data":"ce7bb70d8d66605a00b65db196f138b8d093db85ba2aba770dcd073411b5b8b4"} Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.368370 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-b8qfq" event={"ID":"219e979e-b3a8-42d0-8f23-737a86a2aefb","Type":"ContainerStarted","Data":"4406b94675c6c7ae9446195f8dfab310f4fa8a3adf586cc31ec4c425aaec53ea"} Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.373798 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"810d58f9bf0547af48c65900b9763c368fc3a05bc3a9ac21ac6e368c9e7f38cf"} Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.375655 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8whvl" event={"ID":"6c9729b7-e21b-4509-b337-618094fb2d52","Type":"ContainerStarted","Data":"5f929b6a33cac9c82c31ed28623b82d784e928ccd3655129beee8b99eab88731"} Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.386615 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4501-account-create-update-hj72z" event={"ID":"95df3f15-8d1d-4baf-bbb6-df4939f0d201","Type":"ContainerStarted","Data":"e0c4c5770b60c8e587eeeb148d840581349fd237cbedc0ac808c5bcb6eecdacf"} Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.404835 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=29.404809232 podStartE2EDuration="29.404809232s" podCreationTimestamp="2026-01-29 17:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:05:38.397395953 +0000 UTC m=+2621.306115235" watchObservedRunningTime="2026-01-29 17:05:38.404809232 +0000 UTC m=+2621.313528504" Jan 29 17:05:38 crc 
kubenswrapper[4886]: I0129 17:05:38.405923 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5m27f" event={"ID":"eca25333-29b2-4c38-9e85-ebd2a0d593d6","Type":"ContainerStarted","Data":"cd4898dfd3366424ff76daf2236da5aa1109f2d2ee7053756e696c5c71f74315"} Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.413756 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mj8rv" event={"ID":"f34bb765-0998-45ea-bb61-9fbbc2c7359d","Type":"ContainerStarted","Data":"72783bbbfa79040fb4dc3f351898bfde9b1e9857733a1a00ee4d73ce0d7d9e05"} Jan 29 17:05:38 crc kubenswrapper[4886]: I0129 17:05:38.423156 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-70c1-account-create-update-gwzzv" event={"ID":"2b3dc785-5f55-49ca-8678-5105ba7e0568","Type":"ContainerStarted","Data":"723376c3c9f49ffb2963a000b3bd3332b032ec0a620314db2f5d4affe87fe53d"} Jan 29 17:05:39 crc kubenswrapper[4886]: I0129 17:05:39.408577 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:39 crc kubenswrapper[4886]: I0129 17:05:39.409532 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:39 crc kubenswrapper[4886]: I0129 17:05:39.434967 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4501-account-create-update-hj72z" event={"ID":"95df3f15-8d1d-4baf-bbb6-df4939f0d201","Type":"ContainerStarted","Data":"05a52ecdbf485c6c724d9a992c69aca83958ea1704df0dac8409ddf6fbc7b4d1"} Jan 29 17:05:39 crc kubenswrapper[4886]: I0129 17:05:39.437055 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bd38-account-create-update-rgmr5" event={"ID":"c31fe7aa-0ad1-44ef-a748-b4f366a4d374","Type":"ContainerStarted","Data":"1b2a63dcfed7450a36197cbdc154c29e365ef6be50e63a79bd321d9e35afd21f"} Jan 29 17:05:39 crc kubenswrapper[4886]: I0129 17:05:39.439286 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vvrp4" event={"ID":"61eedb40-ed14-42aa-9751-8bedcd699260","Type":"ContainerStarted","Data":"9211a739518fb120e2bda32757d910dcbc67d03a2ddbfea02f5bc9964d2f0a2d"} Jan 29 17:05:39 crc kubenswrapper[4886]: I0129 17:05:39.441318 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e433-account-create-update-qm5sx" event={"ID":"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff","Type":"ContainerStarted","Data":"c6fd592bb372f4bd56073a5709a8ef40ff848343cbd26b66d1e162d12eab6737"} Jan 29 17:05:39 crc kubenswrapper[4886]: I0129 17:05:39.441387 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e433-account-create-update-qm5sx" event={"ID":"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff","Type":"ContainerStarted","Data":"dda352e99ae8511daf9d45b3e13077ccd37a0c2ef1768700d23fc09ac829a3b5"} Jan 29 17:05:39 crc kubenswrapper[4886]: I0129 17:05:39.444427 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5m27f" event={"ID":"eca25333-29b2-4c38-9e85-ebd2a0d593d6","Type":"ContainerStarted","Data":"c217cd04d2dba654b23c94e4b5b9acb5912a4546fafe4781e26a2d0d53058004"} Jan 29 17:05:39 crc kubenswrapper[4886]: I0129 17:05:39.446622 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-70c1-account-create-update-gwzzv" event={"ID":"2b3dc785-5f55-49ca-8678-5105ba7e0568","Type":"ContainerStarted","Data":"e61c63ed7fdb0d740a758c779dfae1d17126672ffa65adff6cc5cd29f6bcc51c"} Jan 29 17:05:39 
crc kubenswrapper[4886]: I0129 17:05:39.473060 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-b8qfq" podStartSLOduration=4.47304162 podStartE2EDuration="4.47304162s" podCreationTimestamp="2026-01-29 17:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:05:39.46277419 +0000 UTC m=+2622.371493492" watchObservedRunningTime="2026-01-29 17:05:39.47304162 +0000 UTC m=+2622.381760892" Jan 29 17:05:39 crc kubenswrapper[4886]: I0129 17:05:39.549381 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:40 crc kubenswrapper[4886]: I0129 17:05:40.467373 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"756f69fd1c861029e4c8a391947b2f55ba605273bdd0554f8bef49cbf66dc04d"} Jan 29 17:05:40 crc kubenswrapper[4886]: I0129 17:05:40.484653 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mj8rv" event={"ID":"f34bb765-0998-45ea-bb61-9fbbc2c7359d","Type":"ContainerStarted","Data":"78746abbdca4d80f0a57707d5af0310c508403ee469b611bd3861cf01570354a"} Jan 29 17:05:40 crc kubenswrapper[4886]: I0129 17:05:40.498781 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-4501-account-create-update-hj72z" podStartSLOduration=4.498760079 podStartE2EDuration="4.498760079s" podCreationTimestamp="2026-01-29 17:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:05:40.497627217 +0000 UTC m=+2623.406346489" watchObservedRunningTime="2026-01-29 17:05:40.498760079 +0000 UTC m=+2623.407479351" Jan 29 17:05:40 crc kubenswrapper[4886]: I0129 17:05:40.499990 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 29 17:05:40 crc kubenswrapper[4886]: I0129 17:05:40.535557 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-mj8rv" podStartSLOduration=4.535533485 podStartE2EDuration="4.535533485s" podCreationTimestamp="2026-01-29 17:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:05:40.515534891 +0000 UTC m=+2623.424254173" watchObservedRunningTime="2026-01-29 17:05:40.535533485 +0000 UTC m=+2623.444252757" Jan 29 17:05:40 crc kubenswrapper[4886]: I0129 17:05:40.540758 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-e433-account-create-update-qm5sx" podStartSLOduration=4.540748111 podStartE2EDuration="4.540748111s" podCreationTimestamp="2026-01-29 17:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:05:40.53644736 +0000 UTC m=+2623.445166642" watchObservedRunningTime="2026-01-29 17:05:40.540748111 +0000 UTC m=+2623.449467383" Jan 29 17:05:40 crc kubenswrapper[4886]: I0129 17:05:40.566057 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-70c1-account-create-update-gwzzv" podStartSLOduration=4.566031234 podStartE2EDuration="4.566031234s" podCreationTimestamp="2026-01-29 17:05:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:05:40.549183249 +0000 UTC m=+2623.457902521" watchObservedRunningTime="2026-01-29 17:05:40.566031234 +0000 UTC m=+2623.474750506" Jan 29 17:05:40 crc kubenswrapper[4886]: I0129 17:05:40.582917 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-bd38-account-create-update-rgmr5" podStartSLOduration=4.582894449 podStartE2EDuration="4.582894449s" podCreationTimestamp="2026-01-29 17:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:05:40.577350902 +0000 UTC m=+2623.486070174" watchObservedRunningTime="2026-01-29 17:05:40.582894449 +0000 UTC m=+2623.491613721" Jan 29 17:05:40 crc kubenswrapper[4886]: I0129 17:05:40.597479 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-vvrp4" podStartSLOduration=4.597458519 podStartE2EDuration="4.597458519s" podCreationTimestamp="2026-01-29 17:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:05:40.591707577 +0000 UTC m=+2623.500426869" watchObservedRunningTime="2026-01-29 17:05:40.597458519 +0000 UTC m=+2623.506177791" Jan 29 17:05:40 crc kubenswrapper[4886]: I0129 17:05:40.630579 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-5m27f" podStartSLOduration=4.630542961 podStartE2EDuration="4.630542961s" podCreationTimestamp="2026-01-29 17:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:05:40.606784901 +0000 UTC m=+2623.515504173" watchObservedRunningTime="2026-01-29 17:05:40.630542961 +0000 UTC m=+2623.539262233" Jan 29 17:05:41 crc kubenswrapper[4886]: I0129 17:05:41.500019 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"1bd7c804046c935666c5c31215dfb2339d74de5eb7be720b59ecc3c3a7162026"} Jan 29 17:05:41 crc kubenswrapper[4886]: I0129 17:05:41.500654 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"508700665a3990bb13f200c3b8750ea2e16465f0fcff9c608e221b69f0ace0f8"} Jan 29 17:05:44 crc kubenswrapper[4886]: I0129 17:05:44.548352 4886 generic.go:334] "Generic (PLEG): container finished" podID="219e979e-b3a8-42d0-8f23-737a86a2aefb" containerID="ce7bb70d8d66605a00b65db196f138b8d093db85ba2aba770dcd073411b5b8b4" exitCode=0 Jan 29 17:05:44 crc kubenswrapper[4886]: I0129 17:05:44.548442 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-b8qfq" event={"ID":"219e979e-b3a8-42d0-8f23-737a86a2aefb","Type":"ContainerDied","Data":"ce7bb70d8d66605a00b65db196f138b8d093db85ba2aba770dcd073411b5b8b4"} Jan 29 17:05:44 crc kubenswrapper[4886]: I0129 17:05:44.555447 4886 generic.go:334] "Generic (PLEG): container finished" podID="61eedb40-ed14-42aa-9751-8bedcd699260" containerID="9211a739518fb120e2bda32757d910dcbc67d03a2ddbfea02f5bc9964d2f0a2d" exitCode=0 Jan 29 17:05:44 crc kubenswrapper[4886]: I0129 17:05:44.555513 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-vvrp4" event={"ID":"61eedb40-ed14-42aa-9751-8bedcd699260","Type":"ContainerDied","Data":"9211a739518fb120e2bda32757d910dcbc67d03a2ddbfea02f5bc9964d2f0a2d"} Jan 29 17:05:44 crc kubenswrapper[4886]: I0129 17:05:44.557517 4886 generic.go:334] "Generic (PLEG): container finished" podID="eca25333-29b2-4c38-9e85-ebd2a0d593d6" containerID="c217cd04d2dba654b23c94e4b5b9acb5912a4546fafe4781e26a2d0d53058004" exitCode=0 Jan 29 17:05:44 crc kubenswrapper[4886]: I0129 17:05:44.557556 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5m27f" event={"ID":"eca25333-29b2-4c38-9e85-ebd2a0d593d6","Type":"ContainerDied","Data":"c217cd04d2dba654b23c94e4b5b9acb5912a4546fafe4781e26a2d0d53058004"} Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.218971 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5m27f" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.225519 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-b8qfq" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.234586 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vvrp4" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.347797 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrmtg\" (UniqueName: \"kubernetes.io/projected/219e979e-b3a8-42d0-8f23-737a86a2aefb-kube-api-access-qrmtg\") pod \"219e979e-b3a8-42d0-8f23-737a86a2aefb\" (UID: \"219e979e-b3a8-42d0-8f23-737a86a2aefb\") " Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.347905 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca25333-29b2-4c38-9e85-ebd2a0d593d6-operator-scripts\") pod \"eca25333-29b2-4c38-9e85-ebd2a0d593d6\" (UID: \"eca25333-29b2-4c38-9e85-ebd2a0d593d6\") " Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.347949 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61eedb40-ed14-42aa-9751-8bedcd699260-operator-scripts\") pod \"61eedb40-ed14-42aa-9751-8bedcd699260\" (UID: \"61eedb40-ed14-42aa-9751-8bedcd699260\") " Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.348107 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8fng\" (UniqueName: \"kubernetes.io/projected/61eedb40-ed14-42aa-9751-8bedcd699260-kube-api-access-r8fng\") pod \"61eedb40-ed14-42aa-9751-8bedcd699260\" (UID: \"61eedb40-ed14-42aa-9751-8bedcd699260\") " Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.348223 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8zm4\" (UniqueName: \"kubernetes.io/projected/eca25333-29b2-4c38-9e85-ebd2a0d593d6-kube-api-access-c8zm4\") pod \"eca25333-29b2-4c38-9e85-ebd2a0d593d6\" (UID: \"eca25333-29b2-4c38-9e85-ebd2a0d593d6\") " Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.348270 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/219e979e-b3a8-42d0-8f23-737a86a2aefb-operator-scripts\") pod \"219e979e-b3a8-42d0-8f23-737a86a2aefb\" (UID: \"219e979e-b3a8-42d0-8f23-737a86a2aefb\") " Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.348907 4886 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219e979e-b3a8-42d0-8f23-737a86a2aefb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "219e979e-b3a8-42d0-8f23-737a86a2aefb" (UID: "219e979e-b3a8-42d0-8f23-737a86a2aefb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.348925 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61eedb40-ed14-42aa-9751-8bedcd699260-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61eedb40-ed14-42aa-9751-8bedcd699260" (UID: "61eedb40-ed14-42aa-9751-8bedcd699260"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.348962 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca25333-29b2-4c38-9e85-ebd2a0d593d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eca25333-29b2-4c38-9e85-ebd2a0d593d6" (UID: "eca25333-29b2-4c38-9e85-ebd2a0d593d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.349232 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca25333-29b2-4c38-9e85-ebd2a0d593d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.349258 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61eedb40-ed14-42aa-9751-8bedcd699260-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.349268 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/219e979e-b3a8-42d0-8f23-737a86a2aefb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.354025 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca25333-29b2-4c38-9e85-ebd2a0d593d6-kube-api-access-c8zm4" (OuterVolumeSpecName: "kube-api-access-c8zm4") pod "eca25333-29b2-4c38-9e85-ebd2a0d593d6" (UID: "eca25333-29b2-4c38-9e85-ebd2a0d593d6"). InnerVolumeSpecName "kube-api-access-c8zm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.354432 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219e979e-b3a8-42d0-8f23-737a86a2aefb-kube-api-access-qrmtg" (OuterVolumeSpecName: "kube-api-access-qrmtg") pod "219e979e-b3a8-42d0-8f23-737a86a2aefb" (UID: "219e979e-b3a8-42d0-8f23-737a86a2aefb"). InnerVolumeSpecName "kube-api-access-qrmtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.354539 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61eedb40-ed14-42aa-9751-8bedcd699260-kube-api-access-r8fng" (OuterVolumeSpecName: "kube-api-access-r8fng") pod "61eedb40-ed14-42aa-9751-8bedcd699260" (UID: "61eedb40-ed14-42aa-9751-8bedcd699260"). InnerVolumeSpecName "kube-api-access-r8fng". 
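
The teardown of the finished db-create pods above follows the reconciler's usual per-volume order: "UnmountVolume started", then "UnmountVolume.TearDown succeeded", then "Volume detached ... DevicePath \"\"". A minimal sketch for following those phases in a log like this one; the regexes only mirror the wording visible here and are not a stable kubelet format:

```python
# Minimal sketch: follow the per-volume teardown phases that appear in this log
# (UnmountVolume started -> UnmountVolume.TearDown succeeded -> Volume detached).
# The regexes mirror only the wording visible here; this is not a kubelet API.
import re
from collections import defaultdict

STARTED  = re.compile(r'UnmountVolume started for volume "(?P<vol>[^"]+)"')
TORNDOWN = re.compile(r'UnmountVolume\.TearDown succeeded for volume "[^"]+" '
                      r'\(OuterVolumeSpecName: "(?P<vol>[^"]+)"\)')
DETACHED = re.compile(r'Volume detached for volume "(?P<vol>[^"]+)"')

def teardown_phases(log_lines):
    """Return {volume name: set of phases observed} over one pass of the log."""
    phases = defaultdict(set)
    for raw in log_lines:
        line = raw.replace('\\"', '"')   # unescape the quoted klog fields
        for phase, rx in (("started", STARTED),
                          ("torn_down", TORNDOWN),
                          ("detached", DETACHED)):
            m = rx.search(line)
            if m:
                phases[m.group("vol")].add(phase)
    return phases
```

Keying on the volume name alone is a simplification: operator-scripts appears once per job pod here, so a real tool would also capture the pod UID quoted in the same entries. The mount side (the MountVolume.SetUp sequences for dnsmasq-dns and keystone-bootstrap further down) can be followed the same way.
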
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.451662 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8fng\" (UniqueName: \"kubernetes.io/projected/61eedb40-ed14-42aa-9751-8bedcd699260-kube-api-access-r8fng\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.451694 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8zm4\" (UniqueName: \"kubernetes.io/projected/eca25333-29b2-4c38-9e85-ebd2a0d593d6-kube-api-access-c8zm4\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.451704 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrmtg\" (UniqueName: \"kubernetes.io/projected/219e979e-b3a8-42d0-8f23-737a86a2aefb-kube-api-access-qrmtg\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.587370 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-b8qfq" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.587997 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-b8qfq" event={"ID":"219e979e-b3a8-42d0-8f23-737a86a2aefb","Type":"ContainerDied","Data":"4406b94675c6c7ae9446195f8dfab310f4fa8a3adf586cc31ec4c425aaec53ea"} Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.588149 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4406b94675c6c7ae9446195f8dfab310f4fa8a3adf586cc31ec4c425aaec53ea" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.592303 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vvrp4" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.592301 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vvrp4" event={"ID":"61eedb40-ed14-42aa-9751-8bedcd699260","Type":"ContainerDied","Data":"9fec24589ec3e892ddf58d22ea6ebcc076444b7d5a5a5f362446314614208572"} Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.592464 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fec24589ec3e892ddf58d22ea6ebcc076444b7d5a5a5f362446314614208572" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.599565 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5m27f" event={"ID":"eca25333-29b2-4c38-9e85-ebd2a0d593d6","Type":"ContainerDied","Data":"cd4898dfd3366424ff76daf2236da5aa1109f2d2ee7053756e696c5c71f74315"} Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.599610 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd4898dfd3366424ff76daf2236da5aa1109f2d2ee7053756e696c5c71f74315" Jan 29 17:05:46 crc kubenswrapper[4886]: I0129 17:05:46.599679 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5m27f" Jan 29 17:05:47 crc kubenswrapper[4886]: I0129 17:05:47.616127 4886 generic.go:334] "Generic (PLEG): container finished" podID="f34bb765-0998-45ea-bb61-9fbbc2c7359d" containerID="78746abbdca4d80f0a57707d5af0310c508403ee469b611bd3861cf01570354a" exitCode=0 Jan 29 17:05:47 crc kubenswrapper[4886]: I0129 17:05:47.616286 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mj8rv" event={"ID":"f34bb765-0998-45ea-bb61-9fbbc2c7359d","Type":"ContainerDied","Data":"78746abbdca4d80f0a57707d5af0310c508403ee469b611bd3861cf01570354a"} Jan 29 17:05:49 crc kubenswrapper[4886]: I0129 17:05:49.161415 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mj8rv" Jan 29 17:05:49 crc kubenswrapper[4886]: I0129 17:05:49.209288 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9f7t\" (UniqueName: \"kubernetes.io/projected/f34bb765-0998-45ea-bb61-9fbbc2c7359d-kube-api-access-r9f7t\") pod \"f34bb765-0998-45ea-bb61-9fbbc2c7359d\" (UID: \"f34bb765-0998-45ea-bb61-9fbbc2c7359d\") " Jan 29 17:05:49 crc kubenswrapper[4886]: I0129 17:05:49.209439 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34bb765-0998-45ea-bb61-9fbbc2c7359d-operator-scripts\") pod \"f34bb765-0998-45ea-bb61-9fbbc2c7359d\" (UID: \"f34bb765-0998-45ea-bb61-9fbbc2c7359d\") " Jan 29 17:05:49 crc kubenswrapper[4886]: I0129 17:05:49.210628 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f34bb765-0998-45ea-bb61-9fbbc2c7359d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f34bb765-0998-45ea-bb61-9fbbc2c7359d" (UID: "f34bb765-0998-45ea-bb61-9fbbc2c7359d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:49 crc kubenswrapper[4886]: I0129 17:05:49.214543 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34bb765-0998-45ea-bb61-9fbbc2c7359d-kube-api-access-r9f7t" (OuterVolumeSpecName: "kube-api-access-r9f7t") pod "f34bb765-0998-45ea-bb61-9fbbc2c7359d" (UID: "f34bb765-0998-45ea-bb61-9fbbc2c7359d"). InnerVolumeSpecName "kube-api-access-r9f7t". 
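
The pod_startup_latency_tracker entries scattered through this log are internally consistent: podStartE2EDuration matches watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that value minus the image-pull window (lastFinishedPulling minus firstStartedPulling); jobs whose pull timestamps are the zero value 0001-01-01 report the two durations as equal. A quick check against the keystone-db-sync entry a few records below; illustrative arithmetic on the logged fields, not kubelet code:

```python
# Re-deriving the keystone-db-sync startup figures recorded a few entries below;
# a consistency check on the logged fields, not kubelet source code.
from datetime import datetime, timezone

def ts(s):
    """Parse the UTC timestamps printed in these entries (nanoseconds truncated)."""
    return datetime.strptime(s[:26], "%Y-%m-%d %H:%M:%S.%f").replace(tzinfo=timezone.utc)

created   = ts("2026-01-29 17:05:36.000000000")   # podCreationTimestamp
running   = ts("2026-01-29 17:05:50.681857912")   # watchObservedRunningTime
pull_from = ts("2026-01-29 17:05:37.978908746")   # firstStartedPulling
pull_to   = ts("2026-01-29 17:05:48.382863080")   # lastFinishedPulling

e2e = (running - created).total_seconds()            # ~14.6819 s, logged podStartE2EDuration=14.681857912s
slo = e2e - (pull_to - pull_from).total_seconds()    # ~4.2779 s,  logged podStartSLOduration=4.277903578
print(f"E2E={e2e:.6f}s  SLO={slo:.6f}s")
```

The swift-storage-0 entry further down reproduces the same way to within rounding: podStartE2EDuration=1m44.787s, pull window of about 28.86s, podStartSLOduration of about 75.93s.
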
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:05:49 crc kubenswrapper[4886]: I0129 17:05:49.311367 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9f7t\" (UniqueName: \"kubernetes.io/projected/f34bb765-0998-45ea-bb61-9fbbc2c7359d-kube-api-access-r9f7t\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:49 crc kubenswrapper[4886]: I0129 17:05:49.311746 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34bb765-0998-45ea-bb61-9fbbc2c7359d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:49 crc kubenswrapper[4886]: I0129 17:05:49.639558 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mj8rv" event={"ID":"f34bb765-0998-45ea-bb61-9fbbc2c7359d","Type":"ContainerDied","Data":"72783bbbfa79040fb4dc3f351898bfde9b1e9857733a1a00ee4d73ce0d7d9e05"} Jan 29 17:05:49 crc kubenswrapper[4886]: I0129 17:05:49.639602 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72783bbbfa79040fb4dc3f351898bfde9b1e9857733a1a00ee4d73ce0d7d9e05" Jan 29 17:05:49 crc kubenswrapper[4886]: I0129 17:05:49.639707 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mj8rv" Jan 29 17:05:49 crc kubenswrapper[4886]: I0129 17:05:49.641579 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8whvl" event={"ID":"6c9729b7-e21b-4509-b337-618094fb2d52","Type":"ContainerStarted","Data":"c0779e333572b6cd2f4e3dc26dcb63d1cb95b806d59884314b143132c6990518"} Jan 29 17:05:50 crc kubenswrapper[4886]: I0129 17:05:50.658360 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"6126943d7b638f196656287460bc709c85af1650fa60f2f844a7a6f316656604"} Jan 29 17:05:50 crc kubenswrapper[4886]: I0129 17:05:50.682062 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8whvl" podStartSLOduration=4.277903578 podStartE2EDuration="14.681857912s" podCreationTimestamp="2026-01-29 17:05:36 +0000 UTC" firstStartedPulling="2026-01-29 17:05:37.978908746 +0000 UTC m=+2620.887628018" lastFinishedPulling="2026-01-29 17:05:48.38286308 +0000 UTC m=+2631.291582352" observedRunningTime="2026-01-29 17:05:50.677004385 +0000 UTC m=+2633.585723657" watchObservedRunningTime="2026-01-29 17:05:50.681857912 +0000 UTC m=+2633.590577214" Jan 29 17:05:52 crc kubenswrapper[4886]: I0129 17:05:52.696514 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"38ba0cd3468aa429dc897f4bd9147d61f68e0f9426d858466e58c1d619c3733a"} Jan 29 17:05:53 crc kubenswrapper[4886]: I0129 17:05:53.715395 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"3dee29400a52b22f7939257c24506891bbdab5055ef175281c8ec228f41e480c"} Jan 29 17:05:53 crc kubenswrapper[4886]: I0129 17:05:53.715767 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"d364556efd056c1123837a5a65a22f2ed93984242b85061596ededa15610db30"} Jan 29 17:05:53 crc kubenswrapper[4886]: I0129 17:05:53.715783 4886 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"07c28c01a5e885b091f2f4f7ce2f122664a0193a92973254f6fbae68b306e373"} Jan 29 17:05:54 crc kubenswrapper[4886]: I0129 17:05:54.732039 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"fb9bdac361a70b2b4c04db09f01f5ae914f23d75caf457e2a81d51a2bfa4b8da"} Jan 29 17:05:55 crc kubenswrapper[4886]: I0129 17:05:55.747302 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e2f2c6c-bc32-4a32-ba2c-8954d277ce47","Type":"ContainerStarted","Data":"5c8bbde2c57263f7855652fcaace4af5662bbf25ad6eae81ddbdcc492a471484"} Jan 29 17:05:55 crc kubenswrapper[4886]: I0129 17:05:55.787080 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=75.927984867 podStartE2EDuration="1m44.787059653s" podCreationTimestamp="2026-01-29 17:04:11 +0000 UTC" firstStartedPulling="2026-01-29 17:05:19.526879231 +0000 UTC m=+2602.435598503" lastFinishedPulling="2026-01-29 17:05:48.385954007 +0000 UTC m=+2631.294673289" observedRunningTime="2026-01-29 17:05:55.785664694 +0000 UTC m=+2638.694384016" watchObservedRunningTime="2026-01-29 17:05:55.787059653 +0000 UTC m=+2638.695778925" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.114086 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-6r9cj"] Jan 29 17:05:56 crc kubenswrapper[4886]: E0129 17:05:56.115046 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61eedb40-ed14-42aa-9751-8bedcd699260" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.115068 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="61eedb40-ed14-42aa-9751-8bedcd699260" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: E0129 17:05:56.115098 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34bb765-0998-45ea-bb61-9fbbc2c7359d" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.115107 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34bb765-0998-45ea-bb61-9fbbc2c7359d" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: E0129 17:05:56.115133 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219e979e-b3a8-42d0-8f23-737a86a2aefb" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.115141 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="219e979e-b3a8-42d0-8f23-737a86a2aefb" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: E0129 17:05:56.115159 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca25333-29b2-4c38-9e85-ebd2a0d593d6" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.115166 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca25333-29b2-4c38-9e85-ebd2a0d593d6" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.115436 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca25333-29b2-4c38-9e85-ebd2a0d593d6" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.115482 4886 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="61eedb40-ed14-42aa-9751-8bedcd699260" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.115500 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="219e979e-b3a8-42d0-8f23-737a86a2aefb" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.115517 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34bb765-0998-45ea-bb61-9fbbc2c7359d" containerName="mariadb-database-create" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.117178 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.125066 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-6r9cj"] Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.125471 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.268212 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-config\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.268314 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x27r6\" (UniqueName: \"kubernetes.io/projected/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-kube-api-access-x27r6\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.268577 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-svc\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.268737 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.269012 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.269119 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 
17:05:56.371346 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-svc\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.371412 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.371511 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.371563 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.371619 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-config\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.371651 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x27r6\" (UniqueName: \"kubernetes.io/projected/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-kube-api-access-x27r6\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.372491 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.372648 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-config\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.372651 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.372714 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-svc\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.373296 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.461545 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x27r6\" (UniqueName: \"kubernetes.io/projected/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-kube-api-access-x27r6\") pod \"dnsmasq-dns-764c5664d7-6r9cj\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.750365 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.759280 4886 generic.go:334] "Generic (PLEG): container finished" podID="c31fe7aa-0ad1-44ef-a748-b4f366a4d374" containerID="1b2a63dcfed7450a36197cbdc154c29e365ef6be50e63a79bd321d9e35afd21f" exitCode=0 Jan 29 17:05:56 crc kubenswrapper[4886]: I0129 17:05:56.759381 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bd38-account-create-update-rgmr5" event={"ID":"c31fe7aa-0ad1-44ef-a748-b4f366a4d374","Type":"ContainerDied","Data":"1b2a63dcfed7450a36197cbdc154c29e365ef6be50e63a79bd321d9e35afd21f"} Jan 29 17:05:57 crc kubenswrapper[4886]: W0129 17:05:57.413566 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ebe69f9_b35b_47a6_976d_bca3b8b8af25.slice/crio-8a5d3dfd30af2f5ac812c053e6d3808dbffd8286368baff784598dc2a9536f00 WatchSource:0}: Error finding container 8a5d3dfd30af2f5ac812c053e6d3808dbffd8286368baff784598dc2a9536f00: Status 404 returned error can't find the container with id 8a5d3dfd30af2f5ac812c053e6d3808dbffd8286368baff784598dc2a9536f00 Jan 29 17:05:57 crc kubenswrapper[4886]: I0129 17:05:57.415982 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-6r9cj"] Jan 29 17:05:57 crc kubenswrapper[4886]: I0129 17:05:57.780246 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" event={"ID":"0ebe69f9-b35b-47a6-976d-bca3b8b8af25","Type":"ContainerStarted","Data":"8a5d3dfd30af2f5ac812c053e6d3808dbffd8286368baff784598dc2a9536f00"} Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.173318 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-bd38-account-create-update-rgmr5" Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.314796 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-operator-scripts\") pod \"c31fe7aa-0ad1-44ef-a748-b4f366a4d374\" (UID: \"c31fe7aa-0ad1-44ef-a748-b4f366a4d374\") " Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.315231 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lv2x\" (UniqueName: \"kubernetes.io/projected/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-kube-api-access-6lv2x\") pod \"c31fe7aa-0ad1-44ef-a748-b4f366a4d374\" (UID: \"c31fe7aa-0ad1-44ef-a748-b4f366a4d374\") " Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.315883 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c31fe7aa-0ad1-44ef-a748-b4f366a4d374" (UID: "c31fe7aa-0ad1-44ef-a748-b4f366a4d374"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.321081 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-kube-api-access-6lv2x" (OuterVolumeSpecName: "kube-api-access-6lv2x") pod "c31fe7aa-0ad1-44ef-a748-b4f366a4d374" (UID: "c31fe7aa-0ad1-44ef-a748-b4f366a4d374"). InnerVolumeSpecName "kube-api-access-6lv2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.417764 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.417811 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lv2x\" (UniqueName: \"kubernetes.io/projected/c31fe7aa-0ad1-44ef-a748-b4f366a4d374-kube-api-access-6lv2x\") on node \"crc\" DevicePath \"\"" Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.789863 4886 generic.go:334] "Generic (PLEG): container finished" podID="2b3dc785-5f55-49ca-8678-5105ba7e0568" containerID="e61c63ed7fdb0d740a758c779dfae1d17126672ffa65adff6cc5cd29f6bcc51c" exitCode=0 Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.789963 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-70c1-account-create-update-gwzzv" event={"ID":"2b3dc785-5f55-49ca-8678-5105ba7e0568","Type":"ContainerDied","Data":"e61c63ed7fdb0d740a758c779dfae1d17126672ffa65adff6cc5cd29f6bcc51c"} Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.791793 4886 generic.go:334] "Generic (PLEG): container finished" podID="0ebe69f9-b35b-47a6-976d-bca3b8b8af25" containerID="d79e54176b743ae62954d38e473d94b6d45be717a470bbf226985d6f28fe5bd4" exitCode=0 Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.791849 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" event={"ID":"0ebe69f9-b35b-47a6-976d-bca3b8b8af25","Type":"ContainerDied","Data":"d79e54176b743ae62954d38e473d94b6d45be717a470bbf226985d6f28fe5bd4"} Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.793123 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="95df3f15-8d1d-4baf-bbb6-df4939f0d201" containerID="05a52ecdbf485c6c724d9a992c69aca83958ea1704df0dac8409ddf6fbc7b4d1" exitCode=0 Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.793182 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4501-account-create-update-hj72z" event={"ID":"95df3f15-8d1d-4baf-bbb6-df4939f0d201","Type":"ContainerDied","Data":"05a52ecdbf485c6c724d9a992c69aca83958ea1704df0dac8409ddf6fbc7b4d1"} Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.834666 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bd38-account-create-update-rgmr5" event={"ID":"c31fe7aa-0ad1-44ef-a748-b4f366a4d374","Type":"ContainerDied","Data":"5d2dfc86002d797af59c9cb682ec219bf20ee62338a9f69385af929e1e8a81cc"} Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.834714 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d2dfc86002d797af59c9cb682ec219bf20ee62338a9f69385af929e1e8a81cc" Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.834715 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bd38-account-create-update-rgmr5" Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.840830 4886 generic.go:334] "Generic (PLEG): container finished" podID="b8e697ee-193d-4ce1-9905-cebf2e6ba7ff" containerID="c6fd592bb372f4bd56073a5709a8ef40ff848343cbd26b66d1e162d12eab6737" exitCode=0 Jan 29 17:05:58 crc kubenswrapper[4886]: I0129 17:05:58.840883 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e433-account-create-update-qm5sx" event={"ID":"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff","Type":"ContainerDied","Data":"c6fd592bb372f4bd56073a5709a8ef40ff848343cbd26b66d1e162d12eab6737"} Jan 29 17:05:59 crc kubenswrapper[4886]: I0129 17:05:59.853777 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" event={"ID":"0ebe69f9-b35b-47a6-976d-bca3b8b8af25","Type":"ContainerStarted","Data":"9d62c141d557ad4f511cc99617ca7914a9fcfe251f2f34d5a37428a245460d8c"} Jan 29 17:05:59 crc kubenswrapper[4886]: I0129 17:05:59.856255 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:05:59 crc kubenswrapper[4886]: I0129 17:05:59.878989 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" podStartSLOduration=3.878969594 podStartE2EDuration="3.878969594s" podCreationTimestamp="2026-01-29 17:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:05:59.873389207 +0000 UTC m=+2642.782108509" watchObservedRunningTime="2026-01-29 17:05:59.878969594 +0000 UTC m=+2642.787688866" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.406831 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e433-account-create-update-qm5sx" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.414819 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4501-account-create-update-hj72z" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.423241 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-70c1-account-create-update-gwzzv" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.469284 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kv5p\" (UniqueName: \"kubernetes.io/projected/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-kube-api-access-7kv5p\") pod \"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff\" (UID: \"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff\") " Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.469382 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-operator-scripts\") pod \"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff\" (UID: \"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff\") " Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.469953 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8e697ee-193d-4ce1-9905-cebf2e6ba7ff" (UID: "b8e697ee-193d-4ce1-9905-cebf2e6ba7ff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.474526 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-kube-api-access-7kv5p" (OuterVolumeSpecName: "kube-api-access-7kv5p") pod "b8e697ee-193d-4ce1-9905-cebf2e6ba7ff" (UID: "b8e697ee-193d-4ce1-9905-cebf2e6ba7ff"). InnerVolumeSpecName "kube-api-access-7kv5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.570459 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg89r\" (UniqueName: \"kubernetes.io/projected/95df3f15-8d1d-4baf-bbb6-df4939f0d201-kube-api-access-rg89r\") pod \"95df3f15-8d1d-4baf-bbb6-df4939f0d201\" (UID: \"95df3f15-8d1d-4baf-bbb6-df4939f0d201\") " Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.570593 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95df3f15-8d1d-4baf-bbb6-df4939f0d201-operator-scripts\") pod \"95df3f15-8d1d-4baf-bbb6-df4939f0d201\" (UID: \"95df3f15-8d1d-4baf-bbb6-df4939f0d201\") " Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.570838 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b3dc785-5f55-49ca-8678-5105ba7e0568-operator-scripts\") pod \"2b3dc785-5f55-49ca-8678-5105ba7e0568\" (UID: \"2b3dc785-5f55-49ca-8678-5105ba7e0568\") " Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.570886 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhtgf\" (UniqueName: \"kubernetes.io/projected/2b3dc785-5f55-49ca-8678-5105ba7e0568-kube-api-access-lhtgf\") pod \"2b3dc785-5f55-49ca-8678-5105ba7e0568\" (UID: \"2b3dc785-5f55-49ca-8678-5105ba7e0568\") " Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.571458 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kv5p\" (UniqueName: \"kubernetes.io/projected/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-kube-api-access-7kv5p\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.571482 4886 reconciler_common.go:293] 
"Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.578669 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b3dc785-5f55-49ca-8678-5105ba7e0568-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b3dc785-5f55-49ca-8678-5105ba7e0568" (UID: "2b3dc785-5f55-49ca-8678-5105ba7e0568"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.578827 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95df3f15-8d1d-4baf-bbb6-df4939f0d201-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95df3f15-8d1d-4baf-bbb6-df4939f0d201" (UID: "95df3f15-8d1d-4baf-bbb6-df4939f0d201"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.580571 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b3dc785-5f55-49ca-8678-5105ba7e0568-kube-api-access-lhtgf" (OuterVolumeSpecName: "kube-api-access-lhtgf") pod "2b3dc785-5f55-49ca-8678-5105ba7e0568" (UID: "2b3dc785-5f55-49ca-8678-5105ba7e0568"). InnerVolumeSpecName "kube-api-access-lhtgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.590649 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95df3f15-8d1d-4baf-bbb6-df4939f0d201-kube-api-access-rg89r" (OuterVolumeSpecName: "kube-api-access-rg89r") pod "95df3f15-8d1d-4baf-bbb6-df4939f0d201" (UID: "95df3f15-8d1d-4baf-bbb6-df4939f0d201"). InnerVolumeSpecName "kube-api-access-rg89r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.673722 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhtgf\" (UniqueName: \"kubernetes.io/projected/2b3dc785-5f55-49ca-8678-5105ba7e0568-kube-api-access-lhtgf\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.673762 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg89r\" (UniqueName: \"kubernetes.io/projected/95df3f15-8d1d-4baf-bbb6-df4939f0d201-kube-api-access-rg89r\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.673771 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95df3f15-8d1d-4baf-bbb6-df4939f0d201-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.673780 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b3dc785-5f55-49ca-8678-5105ba7e0568-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.864426 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-4501-account-create-update-hj72z" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.864417 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4501-account-create-update-hj72z" event={"ID":"95df3f15-8d1d-4baf-bbb6-df4939f0d201","Type":"ContainerDied","Data":"e0c4c5770b60c8e587eeeb148d840581349fd237cbedc0ac808c5bcb6eecdacf"} Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.864858 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0c4c5770b60c8e587eeeb148d840581349fd237cbedc0ac808c5bcb6eecdacf" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.866334 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e433-account-create-update-qm5sx" event={"ID":"b8e697ee-193d-4ce1-9905-cebf2e6ba7ff","Type":"ContainerDied","Data":"dda352e99ae8511daf9d45b3e13077ccd37a0c2ef1768700d23fc09ac829a3b5"} Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.866372 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda352e99ae8511daf9d45b3e13077ccd37a0c2ef1768700d23fc09ac829a3b5" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.866430 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e433-account-create-update-qm5sx" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.868402 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-70c1-account-create-update-gwzzv" event={"ID":"2b3dc785-5f55-49ca-8678-5105ba7e0568","Type":"ContainerDied","Data":"723376c3c9f49ffb2963a000b3bd3332b032ec0a620314db2f5d4affe87fe53d"} Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.868456 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="723376c3c9f49ffb2963a000b3bd3332b032ec0a620314db2f5d4affe87fe53d" Jan 29 17:06:00 crc kubenswrapper[4886]: I0129 17:06:00.868428 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-70c1-account-create-update-gwzzv" Jan 29 17:06:01 crc kubenswrapper[4886]: I0129 17:06:01.878825 4886 generic.go:334] "Generic (PLEG): container finished" podID="6c9729b7-e21b-4509-b337-618094fb2d52" containerID="c0779e333572b6cd2f4e3dc26dcb63d1cb95b806d59884314b143132c6990518" exitCode=0 Jan 29 17:06:01 crc kubenswrapper[4886]: I0129 17:06:01.878930 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8whvl" event={"ID":"6c9729b7-e21b-4509-b337-618094fb2d52","Type":"ContainerDied","Data":"c0779e333572b6cd2f4e3dc26dcb63d1cb95b806d59884314b143132c6990518"} Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.280074 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8whvl" Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.338616 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-combined-ca-bundle\") pod \"6c9729b7-e21b-4509-b337-618094fb2d52\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.338666 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxrkh\" (UniqueName: \"kubernetes.io/projected/6c9729b7-e21b-4509-b337-618094fb2d52-kube-api-access-gxrkh\") pod \"6c9729b7-e21b-4509-b337-618094fb2d52\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.338759 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-config-data\") pod \"6c9729b7-e21b-4509-b337-618094fb2d52\" (UID: \"6c9729b7-e21b-4509-b337-618094fb2d52\") " Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.346792 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c9729b7-e21b-4509-b337-618094fb2d52-kube-api-access-gxrkh" (OuterVolumeSpecName: "kube-api-access-gxrkh") pod "6c9729b7-e21b-4509-b337-618094fb2d52" (UID: "6c9729b7-e21b-4509-b337-618094fb2d52"). InnerVolumeSpecName "kube-api-access-gxrkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.387310 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c9729b7-e21b-4509-b337-618094fb2d52" (UID: "6c9729b7-e21b-4509-b337-618094fb2d52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.420511 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-config-data" (OuterVolumeSpecName: "config-data") pod "6c9729b7-e21b-4509-b337-618094fb2d52" (UID: "6c9729b7-e21b-4509-b337-618094fb2d52"). InnerVolumeSpecName "config-data". 
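
The keystone chain in this stretch runs db-create, the account-create-update jobs, keystone-db-sync, and then, immediately below, the keystone-bootstrap-b5c9h job is ADDed. One way to recover that ordering from a full kubelet.log is to take each job pod's first "SyncLoop ADD" and last ContainerDied timestamps; a sketch, assuming the usual one-record-per-line kubelet.log layout rather than the wrapped rendering here (regexes are illustrative only):

```python
# Sketch: recover the lifetime window of each one-shot job pod (db-create,
# account-create-update, db-sync, bootstrap) from a kubelet.log, assuming the
# usual one-record-per-line layout. Regexes mirror the wording in this log only.
import re
from collections import defaultdict

STAMP = re.compile(r'^(\w{3} \d{2} \d{2}:\d{2}:\d{2}) crc kubenswrapper')
ADDED = re.compile(r'"SyncLoop ADD" source="api" pods=\["(?P<pod>[^"]+)"\]')
DIED  = re.compile(r'event for pod" pod="(?P<pod>[^"]+)" event=.*"Type":"ContainerDied"')

def job_windows(log_lines):
    """Return {pod: [first ADD timestamp, last ContainerDied timestamp]}."""
    windows = defaultdict(lambda: [None, None])
    for line in log_lines:
        stamp = STAMP.match(line)
        when = stamp.group(1) if stamp else None
        if (m := ADDED.search(line)) and windows[m["pod"]][0] is None:
            windows[m["pod"]][0] = when
        elif m := DIED.search(line):
            windows[m["pod"]][1] = when
    return windows
```

ADD records that list several pods at once, and pods that restart, would need slightly more handling; for the one-shot jobs in this log the two timestamps bracket each job's lifetime.
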
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.441000 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.441032 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9729b7-e21b-4509-b337-618094fb2d52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.441044 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxrkh\" (UniqueName: \"kubernetes.io/projected/6c9729b7-e21b-4509-b337-618094fb2d52-kube-api-access-gxrkh\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.922102 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8whvl" event={"ID":"6c9729b7-e21b-4509-b337-618094fb2d52","Type":"ContainerDied","Data":"5f929b6a33cac9c82c31ed28623b82d784e928ccd3655129beee8b99eab88731"} Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.922161 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f929b6a33cac9c82c31ed28623b82d784e928ccd3655129beee8b99eab88731" Jan 29 17:06:03 crc kubenswrapper[4886]: I0129 17:06:03.922626 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8whvl" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.162756 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-6r9cj"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.163027 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" podUID="0ebe69f9-b35b-47a6-976d-bca3b8b8af25" containerName="dnsmasq-dns" containerID="cri-o://9d62c141d557ad4f511cc99617ca7914a9fcfe251f2f34d5a37428a245460d8c" gracePeriod=10 Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.165591 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.208800 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-b5c9h"] Jan 29 17:06:04 crc kubenswrapper[4886]: E0129 17:06:04.209461 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e697ee-193d-4ce1-9905-cebf2e6ba7ff" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.209488 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e697ee-193d-4ce1-9905-cebf2e6ba7ff" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: E0129 17:06:04.209509 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3dc785-5f55-49ca-8678-5105ba7e0568" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.209517 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3dc785-5f55-49ca-8678-5105ba7e0568" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: E0129 17:06:04.209564 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9729b7-e21b-4509-b337-618094fb2d52" containerName="keystone-db-sync" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 
17:06:04.209572 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9729b7-e21b-4509-b337-618094fb2d52" containerName="keystone-db-sync" Jan 29 17:06:04 crc kubenswrapper[4886]: E0129 17:06:04.209587 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c31fe7aa-0ad1-44ef-a748-b4f366a4d374" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.209595 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31fe7aa-0ad1-44ef-a748-b4f366a4d374" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: E0129 17:06:04.209613 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95df3f15-8d1d-4baf-bbb6-df4939f0d201" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.209620 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="95df3f15-8d1d-4baf-bbb6-df4939f0d201" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.209899 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c9729b7-e21b-4509-b337-618094fb2d52" containerName="keystone-db-sync" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.209926 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c31fe7aa-0ad1-44ef-a748-b4f366a4d374" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.209943 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="95df3f15-8d1d-4baf-bbb6-df4939f0d201" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.209965 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b3dc785-5f55-49ca-8678-5105ba7e0568" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.209980 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8e697ee-193d-4ce1-9905-cebf2e6ba7ff" containerName="mariadb-account-create-update" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.213123 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.220052 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.221133 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.235435 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.235908 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5qcd" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.236154 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.258398 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b5c9h"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.264196 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9fhh\" (UniqueName: \"kubernetes.io/projected/676a9025-a673-4a70-aa9d-ec34c1db17be-kube-api-access-n9fhh\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.264257 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-combined-ca-bundle\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.264409 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-fernet-keys\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.264457 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-config-data\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.264482 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-scripts\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.264829 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-credential-keys\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.289823 4886 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5959f8865f-8962p"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.292098 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.308380 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-8962p"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.346742 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-6nmwn"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.348408 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-6nmwn" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.351004 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.351341 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-658st" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.371126 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-fernet-keys\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.371191 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-config-data\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.374594 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-scripts\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.374668 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjk8f\" (UniqueName: \"kubernetes.io/projected/1fca7a19-7db1-4a2e-9f55-d55442cfda87-kube-api-access-kjk8f\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.374746 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-config\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.375134 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-credential-keys\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.375276 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.375377 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.375579 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9fhh\" (UniqueName: \"kubernetes.io/projected/676a9025-a673-4a70-aa9d-ec34c1db17be-kube-api-access-n9fhh\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.375620 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-combined-ca-bundle\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.375695 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.375731 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-svc\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.396749 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-scripts\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.396957 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-config-data\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.397752 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-fernet-keys\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.398964 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-combined-ca-bundle\") 
pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.408856 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9fhh\" (UniqueName: \"kubernetes.io/projected/676a9025-a673-4a70-aa9d-ec34c1db17be-kube-api-access-n9fhh\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.412461 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-credential-keys\") pod \"keystone-bootstrap-b5c9h\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.439028 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-6nmwn"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.492783 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.492845 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.492892 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v7hl\" (UniqueName: \"kubernetes.io/projected/a0058f32-ae80-4dde-9dce-095c62f45979-kube-api-access-9v7hl\") pod \"heat-db-sync-6nmwn\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " pod="openstack/heat-db-sync-6nmwn" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.493066 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.493097 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-svc\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.493114 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-combined-ca-bundle\") pod \"heat-db-sync-6nmwn\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " pod="openstack/heat-db-sync-6nmwn" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.493186 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjk8f\" 
(UniqueName: \"kubernetes.io/projected/1fca7a19-7db1-4a2e-9f55-d55442cfda87-kube-api-access-kjk8f\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.493224 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-config\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.493256 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-config-data\") pod \"heat-db-sync-6nmwn\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " pod="openstack/heat-db-sync-6nmwn" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.494020 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-svc\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.494141 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.494811 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-config\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.494937 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.495298 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.578911 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.603375 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v7hl\" (UniqueName: \"kubernetes.io/projected/a0058f32-ae80-4dde-9dce-095c62f45979-kube-api-access-9v7hl\") pod \"heat-db-sync-6nmwn\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " pod="openstack/heat-db-sync-6nmwn" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.603508 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-combined-ca-bundle\") pod \"heat-db-sync-6nmwn\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " pod="openstack/heat-db-sync-6nmwn" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.603600 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-config-data\") pod \"heat-db-sync-6nmwn\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " pod="openstack/heat-db-sync-6nmwn" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.624452 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-config-data\") pod \"heat-db-sync-6nmwn\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " pod="openstack/heat-db-sync-6nmwn" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.625252 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjk8f\" (UniqueName: \"kubernetes.io/projected/1fca7a19-7db1-4a2e-9f55-d55442cfda87-kube-api-access-kjk8f\") pod \"dnsmasq-dns-5959f8865f-8962p\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.642269 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-combined-ca-bundle\") pod \"heat-db-sync-6nmwn\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " pod="openstack/heat-db-sync-6nmwn" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.651786 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-qglhp"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.653282 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qglhp" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.673804 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.674001 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wvjgr" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.674639 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.690107 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v7hl\" (UniqueName: \"kubernetes.io/projected/a0058f32-ae80-4dde-9dce-095c62f45979-kube-api-access-9v7hl\") pod \"heat-db-sync-6nmwn\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " pod="openstack/heat-db-sync-6nmwn" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.705383 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qglhp"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.706897 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-config\") pod \"neutron-db-sync-qglhp\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " pod="openstack/neutron-db-sync-qglhp" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.706980 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-combined-ca-bundle\") pod \"neutron-db-sync-qglhp\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " pod="openstack/neutron-db-sync-qglhp" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.707086 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkvgz\" (UniqueName: \"kubernetes.io/projected/43da0665-7e6a-4176-ae84-71128a89a243-kube-api-access-vkvgz\") pod \"neutron-db-sync-qglhp\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " pod="openstack/neutron-db-sync-qglhp" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.748871 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-j5gfz"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.757302 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.760344 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.760400 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.774355 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ldtkt" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.795593 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-j5gfz"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.796205 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.809014 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04dae116-ceca-4588-9cba-1266bfa92caf-etc-machine-id\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.809083 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-combined-ca-bundle\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.809113 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rkdq\" (UniqueName: \"kubernetes.io/projected/04dae116-ceca-4588-9cba-1266bfa92caf-kube-api-access-2rkdq\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.809171 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-scripts\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.809260 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-config-data\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.809316 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-config\") pod \"neutron-db-sync-qglhp\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " pod="openstack/neutron-db-sync-qglhp" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.809383 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-combined-ca-bundle\") pod \"neutron-db-sync-qglhp\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " pod="openstack/neutron-db-sync-qglhp" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.809462 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-db-sync-config-data\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.809495 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkvgz\" (UniqueName: \"kubernetes.io/projected/43da0665-7e6a-4176-ae84-71128a89a243-kube-api-access-vkvgz\") pod \"neutron-db-sync-qglhp\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " pod="openstack/neutron-db-sync-qglhp" Jan 29 
17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.816393 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-6nmwn" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.821197 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-combined-ca-bundle\") pod \"neutron-db-sync-qglhp\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " pod="openstack/neutron-db-sync-qglhp" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.822494 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-config\") pod \"neutron-db-sync-qglhp\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " pod="openstack/neutron-db-sync-qglhp" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.837127 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8m2mm"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.838634 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.842867 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkvgz\" (UniqueName: \"kubernetes.io/projected/43da0665-7e6a-4176-ae84-71128a89a243-kube-api-access-vkvgz\") pod \"neutron-db-sync-qglhp\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " pod="openstack/neutron-db-sync-qglhp" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.844787 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.845110 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.845239 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mrvvt" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.869370 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-q2dxw"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.887311 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.892942 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8m2mm"] Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.928646 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5k8bj" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.937517 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.943034 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ckms\" (UniqueName: \"kubernetes.io/projected/8923ac96-087a-425b-a8b4-c09aa4be3d78-kube-api-access-8ckms\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.943137 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-db-sync-config-data\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.943176 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86p7n\" (UniqueName: \"kubernetes.io/projected/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-kube-api-access-86p7n\") pod \"barbican-db-sync-q2dxw\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.943782 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-db-sync-config-data\") pod \"barbican-db-sync-q2dxw\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.944008 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04dae116-ceca-4588-9cba-1266bfa92caf-etc-machine-id\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.944048 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-combined-ca-bundle\") pod \"barbican-db-sync-q2dxw\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.944113 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-combined-ca-bundle\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.944423 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rkdq\" (UniqueName: 
\"kubernetes.io/projected/04dae116-ceca-4588-9cba-1266bfa92caf-kube-api-access-2rkdq\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.944459 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-combined-ca-bundle\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.944528 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-scripts\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.944617 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-config-data\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.944886 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-scripts\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.944958 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-config-data\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.945191 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8923ac96-087a-425b-a8b4-c09aa4be3d78-logs\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.946750 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04dae116-ceca-4588-9cba-1266bfa92caf-etc-machine-id\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.952203 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-scripts\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.954552 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-config-data\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 
17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.962780 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-db-sync-config-data\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.978178 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-combined-ca-bundle\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.981046 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rkdq\" (UniqueName: \"kubernetes.io/projected/04dae116-ceca-4588-9cba-1266bfa92caf-kube-api-access-2rkdq\") pod \"cinder-db-sync-j5gfz\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:04 crc kubenswrapper[4886]: I0129 17:06:04.988707 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q2dxw"] Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.002025 4886 generic.go:334] "Generic (PLEG): container finished" podID="0ebe69f9-b35b-47a6-976d-bca3b8b8af25" containerID="9d62c141d557ad4f511cc99617ca7914a9fcfe251f2f34d5a37428a245460d8c" exitCode=0 Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.002071 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" event={"ID":"0ebe69f9-b35b-47a6-976d-bca3b8b8af25","Type":"ContainerDied","Data":"9d62c141d557ad4f511cc99617ca7914a9fcfe251f2f34d5a37428a245460d8c"} Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.062780 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-scripts\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.062854 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-config-data\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.062996 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8923ac96-087a-425b-a8b4-c09aa4be3d78-logs\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.063102 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ckms\" (UniqueName: \"kubernetes.io/projected/8923ac96-087a-425b-a8b4-c09aa4be3d78-kube-api-access-8ckms\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.063155 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86p7n\" (UniqueName: 
\"kubernetes.io/projected/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-kube-api-access-86p7n\") pod \"barbican-db-sync-q2dxw\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.063261 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-db-sync-config-data\") pod \"barbican-db-sync-q2dxw\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.063339 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-combined-ca-bundle\") pod \"barbican-db-sync-q2dxw\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.063394 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-combined-ca-bundle\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.064472 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8923ac96-087a-425b-a8b4-c09aa4be3d78-logs\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.091516 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-scripts\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.108983 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ckms\" (UniqueName: \"kubernetes.io/projected/8923ac96-087a-425b-a8b4-c09aa4be3d78-kube-api-access-8ckms\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.116487 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-combined-ca-bundle\") pod \"barbican-db-sync-q2dxw\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.116824 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-db-sync-config-data\") pod \"barbican-db-sync-q2dxw\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.117545 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-config-data\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " 
pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.122607 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-combined-ca-bundle\") pod \"placement-db-sync-8m2mm\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.125824 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qglhp" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.131197 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-8962p"] Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.141157 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86p7n\" (UniqueName: \"kubernetes.io/projected/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-kube-api-access-86p7n\") pod \"barbican-db-sync-q2dxw\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.147288 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.153149 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5smww"] Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.157499 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.213932 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5smww"] Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.225758 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.237434 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.238115 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8m2mm" Jan 29 17:06:05 crc kubenswrapper[4886]: E0129 17:06:05.249074 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebe69f9-b35b-47a6-976d-bca3b8b8af25" containerName="dnsmasq-dns" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.249113 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebe69f9-b35b-47a6-976d-bca3b8b8af25" containerName="dnsmasq-dns" Jan 29 17:06:05 crc kubenswrapper[4886]: E0129 17:06:05.249162 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebe69f9-b35b-47a6-976d-bca3b8b8af25" containerName="init" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.249169 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebe69f9-b35b-47a6-976d-bca3b8b8af25" containerName="init" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.249474 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ebe69f9-b35b-47a6-976d-bca3b8b8af25" containerName="dnsmasq-dns" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.251396 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.252075 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.258182 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.258373 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.272402 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.273850 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.274106 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88mjr\" (UniqueName: \"kubernetes.io/projected/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-kube-api-access-88mjr\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.274489 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.274735 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.275221 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-config\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.305160 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.380887 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-config\") pod \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.387951 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-swift-storage-0\") pod \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.388054 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-svc\") pod \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.389182 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-sb\") pod \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.389214 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x27r6\" (UniqueName: \"kubernetes.io/projected/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-kube-api-access-x27r6\") pod \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.389340 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-nb\") pod \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\" (UID: \"0ebe69f9-b35b-47a6-976d-bca3b8b8af25\") " Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.389837 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4459b\" (UniqueName: \"kubernetes.io/projected/87986c31-37d7-4624-87a2-b5678e01d865-kube-api-access-4459b\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.389891 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.389997 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.390051 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-run-httpd\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.390123 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.390174 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-config-data\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.390376 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-config\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.390403 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-log-httpd\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.390570 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.390782 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-scripts\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.391019 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.391099 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.391140 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88mjr\" (UniqueName: \"kubernetes.io/projected/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-kube-api-access-88mjr\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " 
pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.391692 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.396050 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-config\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.396830 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.398092 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.401891 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.421535 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-kube-api-access-x27r6" (OuterVolumeSpecName: "kube-api-access-x27r6") pod "0ebe69f9-b35b-47a6-976d-bca3b8b8af25" (UID: "0ebe69f9-b35b-47a6-976d-bca3b8b8af25"). InnerVolumeSpecName "kube-api-access-x27r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.441403 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mjr\" (UniqueName: \"kubernetes.io/projected/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-kube-api-access-88mjr\") pod \"dnsmasq-dns-58dd9ff6bc-5smww\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.486359 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.498304 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.498384 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-scripts\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.498604 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4459b\" (UniqueName: \"kubernetes.io/projected/87986c31-37d7-4624-87a2-b5678e01d865-kube-api-access-4459b\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.498637 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.498717 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-run-httpd\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.498790 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-config-data\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.498869 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-log-httpd\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.503570 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-run-httpd\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.504466 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x27r6\" (UniqueName: \"kubernetes.io/projected/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-kube-api-access-x27r6\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.505066 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-log-httpd\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 
17:06:05.514630 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-config-data\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.541006 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.541672 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-scripts\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.542801 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.553820 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b5c9h"] Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.562234 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-config" (OuterVolumeSpecName: "config") pod "0ebe69f9-b35b-47a6-976d-bca3b8b8af25" (UID: "0ebe69f9-b35b-47a6-976d-bca3b8b8af25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.566303 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4459b\" (UniqueName: \"kubernetes.io/projected/87986c31-37d7-4624-87a2-b5678e01d865-kube-api-access-4459b\") pod \"ceilometer-0\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.597798 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ebe69f9-b35b-47a6-976d-bca3b8b8af25" (UID: "0ebe69f9-b35b-47a6-976d-bca3b8b8af25"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.598830 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ebe69f9-b35b-47a6-976d-bca3b8b8af25" (UID: "0ebe69f9-b35b-47a6-976d-bca3b8b8af25"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.606448 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0ebe69f9-b35b-47a6-976d-bca3b8b8af25" (UID: "0ebe69f9-b35b-47a6-976d-bca3b8b8af25"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.614407 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.615094 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.615106 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.615117 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.620804 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ebe69f9-b35b-47a6-976d-bca3b8b8af25" (UID: "0ebe69f9-b35b-47a6-976d-bca3b8b8af25"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.663121 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:06:05 crc kubenswrapper[4886]: I0129 17:06:05.727427 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ebe69f9-b35b-47a6-976d-bca3b8b8af25-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.029369 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" event={"ID":"0ebe69f9-b35b-47a6-976d-bca3b8b8af25","Type":"ContainerDied","Data":"8a5d3dfd30af2f5ac812c053e6d3808dbffd8286368baff784598dc2a9536f00"} Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.029772 4886 scope.go:117] "RemoveContainer" containerID="9d62c141d557ad4f511cc99617ca7914a9fcfe251f2f34d5a37428a245460d8c" Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.029695 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-6r9cj" Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.052523 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b5c9h" event={"ID":"676a9025-a673-4a70-aa9d-ec34c1db17be","Type":"ContainerStarted","Data":"24f822770ac33b496012b10bfe803c315a5cfcfd68498769b1825800fd0da253"} Jan 29 17:06:06 crc kubenswrapper[4886]: W0129 17:06:06.054868 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fca7a19_7db1_4a2e_9f55_d55442cfda87.slice/crio-f0564cff8924c07ae14fa9bcf81d675ce573496e02b6b78a9d4ba5771735575e WatchSource:0}: Error finding container f0564cff8924c07ae14fa9bcf81d675ce573496e02b6b78a9d4ba5771735575e: Status 404 returned error can't find the container with id f0564cff8924c07ae14fa9bcf81d675ce573496e02b6b78a9d4ba5771735575e Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.059168 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-8962p"] Jan 29 17:06:06 crc kubenswrapper[4886]: W0129 17:06:06.091037 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0058f32_ae80_4dde_9dce_095c62f45979.slice/crio-d9df74376035a2b4e196d856e8d76469a75a91514ac671f314bd4926926ee2e3 WatchSource:0}: Error finding container d9df74376035a2b4e196d856e8d76469a75a91514ac671f314bd4926926ee2e3: Status 404 returned error can't find the container with id d9df74376035a2b4e196d856e8d76469a75a91514ac671f314bd4926926ee2e3 Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.109837 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-6nmwn"] Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.151660 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-6r9cj"] Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.179225 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-6r9cj"] Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.181903 4886 scope.go:117] "RemoveContainer" containerID="d79e54176b743ae62954d38e473d94b6d45be717a470bbf226985d6f28fe5bd4" Jan 29 17:06:06 crc kubenswrapper[4886]: W0129 17:06:06.283291 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43da0665_7e6a_4176_ae84_71128a89a243.slice/crio-466198a6dbe8073f38dde3862e5bfda50e204a4fc5dd98f6c616c1e63cc8d1a0 WatchSource:0}: Error finding container 466198a6dbe8073f38dde3862e5bfda50e204a4fc5dd98f6c616c1e63cc8d1a0: Status 404 returned error can't find the container with id 466198a6dbe8073f38dde3862e5bfda50e204a4fc5dd98f6c616c1e63cc8d1a0 Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.287575 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qglhp"] Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.332485 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-j5gfz"] Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.553013 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8m2mm"] Jan 29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.638025 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ebe69f9-b35b-47a6-976d-bca3b8b8af25" path="/var/lib/kubelet/pods/0ebe69f9-b35b-47a6-976d-bca3b8b8af25/volumes" Jan 
29 17:06:06 crc kubenswrapper[4886]: I0129 17:06:06.943481 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.040293 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q2dxw"] Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.052952 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5smww"] Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.074711 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.104817 4886 generic.go:334] "Generic (PLEG): container finished" podID="1fca7a19-7db1-4a2e-9f55-d55442cfda87" containerID="850b39de005465a0ca176b82210b0557b234cba9ae1cd5ffefbe61ffc7abab5e" exitCode=0 Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.104997 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-8962p" event={"ID":"1fca7a19-7db1-4a2e-9f55-d55442cfda87","Type":"ContainerDied","Data":"850b39de005465a0ca176b82210b0557b234cba9ae1cd5ffefbe61ffc7abab5e"} Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.105076 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-8962p" event={"ID":"1fca7a19-7db1-4a2e-9f55-d55442cfda87","Type":"ContainerStarted","Data":"f0564cff8924c07ae14fa9bcf81d675ce573496e02b6b78a9d4ba5771735575e"} Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.108853 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" event={"ID":"2c74b25a-0daf-4c7e-a023-a7082d8d73cf","Type":"ContainerStarted","Data":"02d41ab973396ad0b9067fb7d12dd022b4232ab3e2460c195caa3ce7c6f4e250"} Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.115669 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qglhp" event={"ID":"43da0665-7e6a-4176-ae84-71128a89a243","Type":"ContainerStarted","Data":"c4ce1f7996acaa4140e3f499ede2bc0c80a3f2eb7c1df999e0b4f5903e1d75cf"} Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.115718 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qglhp" event={"ID":"43da0665-7e6a-4176-ae84-71128a89a243","Type":"ContainerStarted","Data":"466198a6dbe8073f38dde3862e5bfda50e204a4fc5dd98f6c616c1e63cc8d1a0"} Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.134512 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6nmwn" event={"ID":"a0058f32-ae80-4dde-9dce-095c62f45979","Type":"ContainerStarted","Data":"d9df74376035a2b4e196d856e8d76469a75a91514ac671f314bd4926926ee2e3"} Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.159647 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-qglhp" podStartSLOduration=3.159624139 podStartE2EDuration="3.159624139s" podCreationTimestamp="2026-01-29 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:06:07.157410897 +0000 UTC m=+2650.066130189" watchObservedRunningTime="2026-01-29 17:06:07.159624139 +0000 UTC m=+2650.068343421" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.171451 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b5c9h" 
event={"ID":"676a9025-a673-4a70-aa9d-ec34c1db17be","Type":"ContainerStarted","Data":"9b68510df598b451ff2d4faad4a0af1636831487ecf72ad66ce874c635cd8d9e"} Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.188175 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8m2mm" event={"ID":"8923ac96-087a-425b-a8b4-c09aa4be3d78","Type":"ContainerStarted","Data":"7ba3dd51612ec84b7435debfb27c88330b100c1320a10e3e0bea0e482e076cd8"} Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.197108 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j5gfz" event={"ID":"04dae116-ceca-4588-9cba-1266bfa92caf","Type":"ContainerStarted","Data":"3d72bfc601ef7f8aa44a162e8a49bc717daf618d327e886ac546527a7c3a7e17"} Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.199251 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q2dxw" event={"ID":"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd","Type":"ContainerStarted","Data":"474a2d0d1c07609e70e6ff2d358c4e7ec5598344e910e4e2e3ec3d713255b48d"} Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.223148 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-b5c9h" podStartSLOduration=3.223125538 podStartE2EDuration="3.223125538s" podCreationTimestamp="2026-01-29 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:06:07.191658671 +0000 UTC m=+2650.100377963" watchObservedRunningTime="2026-01-29 17:06:07.223125538 +0000 UTC m=+2650.131844810" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.709054 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.721537 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-config\") pod \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.721606 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-nb\") pod \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.721632 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-svc\") pod \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.721650 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-sb\") pod \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.756751 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fca7a19-7db1-4a2e-9f55-d55442cfda87" (UID: 
"1fca7a19-7db1-4a2e-9f55-d55442cfda87"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.759387 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fca7a19-7db1-4a2e-9f55-d55442cfda87" (UID: "1fca7a19-7db1-4a2e-9f55-d55442cfda87"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.766026 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fca7a19-7db1-4a2e-9f55-d55442cfda87" (UID: "1fca7a19-7db1-4a2e-9f55-d55442cfda87"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.800900 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-config" (OuterVolumeSpecName: "config") pod "1fca7a19-7db1-4a2e-9f55-d55442cfda87" (UID: "1fca7a19-7db1-4a2e-9f55-d55442cfda87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.823577 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-swift-storage-0\") pod \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.823695 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjk8f\" (UniqueName: \"kubernetes.io/projected/1fca7a19-7db1-4a2e-9f55-d55442cfda87-kube-api-access-kjk8f\") pod \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\" (UID: \"1fca7a19-7db1-4a2e-9f55-d55442cfda87\") " Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.824261 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.824297 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.824312 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.824336 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.827040 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fca7a19-7db1-4a2e-9f55-d55442cfda87-kube-api-access-kjk8f" (OuterVolumeSpecName: "kube-api-access-kjk8f") pod "1fca7a19-7db1-4a2e-9f55-d55442cfda87" (UID: "1fca7a19-7db1-4a2e-9f55-d55442cfda87"). 
InnerVolumeSpecName "kube-api-access-kjk8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.855851 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1fca7a19-7db1-4a2e-9f55-d55442cfda87" (UID: "1fca7a19-7db1-4a2e-9f55-d55442cfda87"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.928496 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fca7a19-7db1-4a2e-9f55-d55442cfda87-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:07 crc kubenswrapper[4886]: I0129 17:06:07.928541 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjk8f\" (UniqueName: \"kubernetes.io/projected/1fca7a19-7db1-4a2e-9f55-d55442cfda87-kube-api-access-kjk8f\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:08 crc kubenswrapper[4886]: I0129 17:06:08.218571 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-8962p" event={"ID":"1fca7a19-7db1-4a2e-9f55-d55442cfda87","Type":"ContainerDied","Data":"f0564cff8924c07ae14fa9bcf81d675ce573496e02b6b78a9d4ba5771735575e"} Jan 29 17:06:08 crc kubenswrapper[4886]: I0129 17:06:08.218622 4886 scope.go:117] "RemoveContainer" containerID="850b39de005465a0ca176b82210b0557b234cba9ae1cd5ffefbe61ffc7abab5e" Jan 29 17:06:08 crc kubenswrapper[4886]: I0129 17:06:08.219136 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-8962p" Jan 29 17:06:08 crc kubenswrapper[4886]: I0129 17:06:08.228655 4886 generic.go:334] "Generic (PLEG): container finished" podID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerID="cfb7fc79ff5a728a120650052bd3ff240e06f929a54c3a2f5efc1ad8f2dd226b" exitCode=0 Jan 29 17:06:08 crc kubenswrapper[4886]: I0129 17:06:08.228702 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" event={"ID":"2c74b25a-0daf-4c7e-a023-a7082d8d73cf","Type":"ContainerDied","Data":"cfb7fc79ff5a728a120650052bd3ff240e06f929a54c3a2f5efc1ad8f2dd226b"} Jan 29 17:06:08 crc kubenswrapper[4886]: I0129 17:06:08.241786 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87986c31-37d7-4624-87a2-b5678e01d865","Type":"ContainerStarted","Data":"3e6ce925c7e7561fcefff1c9869e186415899419d2d1d24db82a0097aea34d23"} Jan 29 17:06:08 crc kubenswrapper[4886]: I0129 17:06:08.316169 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-8962p"] Jan 29 17:06:08 crc kubenswrapper[4886]: I0129 17:06:08.326620 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-8962p"] Jan 29 17:06:08 crc kubenswrapper[4886]: I0129 17:06:08.654076 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fca7a19-7db1-4a2e-9f55-d55442cfda87" path="/var/lib/kubelet/pods/1fca7a19-7db1-4a2e-9f55-d55442cfda87/volumes" Jan 29 17:06:09 crc kubenswrapper[4886]: I0129 17:06:09.254844 4886 generic.go:334] "Generic (PLEG): container finished" podID="9f114908-5594-4378-939f-f54b2157d676" containerID="76e9fd9551f88713599d793f819bec47fc38185510d47fbd152e0939943ac037" exitCode=0 Jan 29 17:06:09 crc kubenswrapper[4886]: I0129 
17:06:09.254898 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-thqn5" event={"ID":"9f114908-5594-4378-939f-f54b2157d676","Type":"ContainerDied","Data":"76e9fd9551f88713599d793f819bec47fc38185510d47fbd152e0939943ac037"} Jan 29 17:06:09 crc kubenswrapper[4886]: I0129 17:06:09.258497 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" event={"ID":"2c74b25a-0daf-4c7e-a023-a7082d8d73cf","Type":"ContainerStarted","Data":"2b118af4cda69e6639958e45442cbb3e2fb4932b299bff9387ea1c20cb9f4e45"} Jan 29 17:06:09 crc kubenswrapper[4886]: I0129 17:06:09.258932 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:09 crc kubenswrapper[4886]: I0129 17:06:09.334398 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" podStartSLOduration=5.334370352 podStartE2EDuration="5.334370352s" podCreationTimestamp="2026-01-29 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:06:09.322106416 +0000 UTC m=+2652.230825698" watchObservedRunningTime="2026-01-29 17:06:09.334370352 +0000 UTC m=+2652.243089624" Jan 29 17:06:11 crc kubenswrapper[4886]: I0129 17:06:11.280005 4886 generic.go:334] "Generic (PLEG): container finished" podID="676a9025-a673-4a70-aa9d-ec34c1db17be" containerID="9b68510df598b451ff2d4faad4a0af1636831487ecf72ad66ce874c635cd8d9e" exitCode=0 Jan 29 17:06:11 crc kubenswrapper[4886]: I0129 17:06:11.280066 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b5c9h" event={"ID":"676a9025-a673-4a70-aa9d-ec34c1db17be","Type":"ContainerDied","Data":"9b68510df598b451ff2d4faad4a0af1636831487ecf72ad66ce874c635cd8d9e"} Jan 29 17:06:12 crc kubenswrapper[4886]: I0129 17:06:12.995267 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.003523 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-thqn5" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.165295 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9fhh\" (UniqueName: \"kubernetes.io/projected/676a9025-a673-4a70-aa9d-ec34c1db17be-kube-api-access-n9fhh\") pod \"676a9025-a673-4a70-aa9d-ec34c1db17be\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.165418 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-combined-ca-bundle\") pod \"9f114908-5594-4378-939f-f54b2157d676\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.165470 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-config-data\") pod \"9f114908-5594-4378-939f-f54b2157d676\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.165553 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-credential-keys\") pod \"676a9025-a673-4a70-aa9d-ec34c1db17be\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.165669 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-fernet-keys\") pod \"676a9025-a673-4a70-aa9d-ec34c1db17be\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.165706 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-scripts\") pod \"676a9025-a673-4a70-aa9d-ec34c1db17be\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.165736 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c7r8\" (UniqueName: \"kubernetes.io/projected/9f114908-5594-4378-939f-f54b2157d676-kube-api-access-6c7r8\") pod \"9f114908-5594-4378-939f-f54b2157d676\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.165757 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-combined-ca-bundle\") pod \"676a9025-a673-4a70-aa9d-ec34c1db17be\" (UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.165807 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-db-sync-config-data\") pod \"9f114908-5594-4378-939f-f54b2157d676\" (UID: \"9f114908-5594-4378-939f-f54b2157d676\") " Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.165880 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-config-data\") pod \"676a9025-a673-4a70-aa9d-ec34c1db17be\" 
(UID: \"676a9025-a673-4a70-aa9d-ec34c1db17be\") " Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.171833 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f114908-5594-4378-939f-f54b2157d676-kube-api-access-6c7r8" (OuterVolumeSpecName: "kube-api-access-6c7r8") pod "9f114908-5594-4378-939f-f54b2157d676" (UID: "9f114908-5594-4378-939f-f54b2157d676"). InnerVolumeSpecName "kube-api-access-6c7r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.173449 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "676a9025-a673-4a70-aa9d-ec34c1db17be" (UID: "676a9025-a673-4a70-aa9d-ec34c1db17be"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.175270 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9f114908-5594-4378-939f-f54b2157d676" (UID: "9f114908-5594-4378-939f-f54b2157d676"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.176490 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "676a9025-a673-4a70-aa9d-ec34c1db17be" (UID: "676a9025-a673-4a70-aa9d-ec34c1db17be"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.180174 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/676a9025-a673-4a70-aa9d-ec34c1db17be-kube-api-access-n9fhh" (OuterVolumeSpecName: "kube-api-access-n9fhh") pod "676a9025-a673-4a70-aa9d-ec34c1db17be" (UID: "676a9025-a673-4a70-aa9d-ec34c1db17be"). InnerVolumeSpecName "kube-api-access-n9fhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.181441 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-scripts" (OuterVolumeSpecName: "scripts") pod "676a9025-a673-4a70-aa9d-ec34c1db17be" (UID: "676a9025-a673-4a70-aa9d-ec34c1db17be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.205673 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "676a9025-a673-4a70-aa9d-ec34c1db17be" (UID: "676a9025-a673-4a70-aa9d-ec34c1db17be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.208722 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-config-data" (OuterVolumeSpecName: "config-data") pod "676a9025-a673-4a70-aa9d-ec34c1db17be" (UID: "676a9025-a673-4a70-aa9d-ec34c1db17be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.209064 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f114908-5594-4378-939f-f54b2157d676" (UID: "9f114908-5594-4378-939f-f54b2157d676"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.250702 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-config-data" (OuterVolumeSpecName: "config-data") pod "9f114908-5594-4378-939f-f54b2157d676" (UID: "9f114908-5594-4378-939f-f54b2157d676"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.268031 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.268104 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.268116 4886 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.268124 4886 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.268172 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.268181 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c7r8\" (UniqueName: \"kubernetes.io/projected/9f114908-5594-4378-939f-f54b2157d676-kube-api-access-6c7r8\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.268193 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.268201 4886 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f114908-5594-4378-939f-f54b2157d676-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.268210 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676a9025-a673-4a70-aa9d-ec34c1db17be-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.268238 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9fhh\" (UniqueName: 
\"kubernetes.io/projected/676a9025-a673-4a70-aa9d-ec34c1db17be-kube-api-access-n9fhh\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.301645 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-thqn5" event={"ID":"9f114908-5594-4378-939f-f54b2157d676","Type":"ContainerDied","Data":"fcc8bbf40553cde9c2b386443b55115feca44b41f5cbd715334aa7b1506eef78"} Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.301679 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcc8bbf40553cde9c2b386443b55115feca44b41f5cbd715334aa7b1506eef78" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.301753 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-thqn5" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.309974 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b5c9h" event={"ID":"676a9025-a673-4a70-aa9d-ec34c1db17be","Type":"ContainerDied","Data":"24f822770ac33b496012b10bfe803c315a5cfcfd68498769b1825800fd0da253"} Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.310029 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24f822770ac33b496012b10bfe803c315a5cfcfd68498769b1825800fd0da253" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.310071 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b5c9h" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.372966 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-b5c9h"] Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.385812 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-b5c9h"] Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.478129 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-p924n"] Jan 29 17:06:13 crc kubenswrapper[4886]: E0129 17:06:13.478910 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f114908-5594-4378-939f-f54b2157d676" containerName="glance-db-sync" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.478950 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f114908-5594-4378-939f-f54b2157d676" containerName="glance-db-sync" Jan 29 17:06:13 crc kubenswrapper[4886]: E0129 17:06:13.478971 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="676a9025-a673-4a70-aa9d-ec34c1db17be" containerName="keystone-bootstrap" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.478979 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="676a9025-a673-4a70-aa9d-ec34c1db17be" containerName="keystone-bootstrap" Jan 29 17:06:13 crc kubenswrapper[4886]: E0129 17:06:13.478994 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fca7a19-7db1-4a2e-9f55-d55442cfda87" containerName="init" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.479003 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fca7a19-7db1-4a2e-9f55-d55442cfda87" containerName="init" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.479371 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="676a9025-a673-4a70-aa9d-ec34c1db17be" containerName="keystone-bootstrap" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.480149 4886 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1fca7a19-7db1-4a2e-9f55-d55442cfda87" containerName="init" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.480184 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f114908-5594-4378-939f-f54b2157d676" containerName="glance-db-sync" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.481225 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.485172 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.485657 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.485669 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5qcd" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.485695 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.487340 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.494759 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p924n"] Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.676511 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-fernet-keys\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.676645 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq47h\" (UniqueName: \"kubernetes.io/projected/68cdc6ed-ce63-43af-8502-b36cc0ae788a-kube-api-access-cq47h\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.676680 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-credential-keys\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.676737 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-config-data\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.676844 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-scripts\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.676880 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-combined-ca-bundle\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.779558 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-scripts\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.779648 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-combined-ca-bundle\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.779780 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-fernet-keys\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.780110 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq47h\" (UniqueName: \"kubernetes.io/projected/68cdc6ed-ce63-43af-8502-b36cc0ae788a-kube-api-access-cq47h\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.780170 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-credential-keys\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.780265 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-config-data\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.785016 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-scripts\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.785054 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-combined-ca-bundle\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.785037 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-fernet-keys\") pod \"keystone-bootstrap-p924n\" (UID: 
\"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.785174 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-config-data\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.786241 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-credential-keys\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.796734 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq47h\" (UniqueName: \"kubernetes.io/projected/68cdc6ed-ce63-43af-8502-b36cc0ae788a-kube-api-access-cq47h\") pod \"keystone-bootstrap-p924n\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:13 crc kubenswrapper[4886]: I0129 17:06:13.808169 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p924n" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.480769 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5smww"] Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.481279 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="dnsmasq-dns" containerID="cri-o://2b118af4cda69e6639958e45442cbb3e2fb4932b299bff9387ea1c20cb9f4e45" gracePeriod=10 Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.482769 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.532587 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-96hn8"] Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.534366 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.552761 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-96hn8"] Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.614729 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.614789 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-config\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.614870 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.614986 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.615014 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bwl\" (UniqueName: \"kubernetes.io/projected/80d171a6-11ab-4cdf-b469-acb56ff11735-kube-api-access-t8bwl\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.615072 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.633213 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="676a9025-a673-4a70-aa9d-ec34c1db17be" path="/var/lib/kubelet/pods/676a9025-a673-4a70-aa9d-ec34c1db17be/volumes" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.716422 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.716473 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8bwl\" (UniqueName: 
\"kubernetes.io/projected/80d171a6-11ab-4cdf-b469-acb56ff11735-kube-api-access-t8bwl\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.716541 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.716667 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.716701 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-config\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.716751 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.717698 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.717730 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.717785 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.718096 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.718478 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-config\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: 
\"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.737450 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bwl\" (UniqueName: \"kubernetes.io/projected/80d171a6-11ab-4cdf-b469-acb56ff11735-kube-api-access-t8bwl\") pod \"dnsmasq-dns-785d8bcb8c-96hn8\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:14 crc kubenswrapper[4886]: I0129 17:06:14.868768 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.338711 4886 generic.go:334] "Generic (PLEG): container finished" podID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerID="2b118af4cda69e6639958e45442cbb3e2fb4932b299bff9387ea1c20cb9f4e45" exitCode=0 Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.338765 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" event={"ID":"2c74b25a-0daf-4c7e-a023-a7082d8d73cf","Type":"ContainerDied","Data":"2b118af4cda69e6639958e45442cbb3e2fb4932b299bff9387ea1c20cb9f4e45"} Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.394181 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.397684 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.406311 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.409854 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cpfdg" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.432188 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.465537 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.487437 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.202:5353: connect: connection refused" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.537221 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.537283 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-648q7\" (UniqueName: \"kubernetes.io/projected/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-kube-api-access-648q7\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.537364 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-scripts\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.537395 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-config-data\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.537418 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.537480 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.537563 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-logs\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.622449 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.624650 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.628885 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.639141 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-scripts\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.639180 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-config-data\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.639205 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.639245 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.639315 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-logs\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.639438 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.639463 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-648q7\" (UniqueName: \"kubernetes.io/projected/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-kube-api-access-648q7\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.640528 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-logs\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.640540 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.645170 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.645232 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9fc1bf04f61733e1543e4c6d32069c38c610c3d0fa9a349fa6a409f3542d3c50/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.645299 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.646984 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-scripts\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.647918 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-config-data\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.649978 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.674025 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-648q7\" (UniqueName: \"kubernetes.io/projected/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-kube-api-access-648q7\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.695384 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.731755 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.741436 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4f7\" (UniqueName: \"kubernetes.io/projected/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-kube-api-access-xs4f7\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.741589 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.741612 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.741668 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.741754 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-logs\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.741845 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.741911 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.843895 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.843938 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.843982 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.844048 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-logs\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.844113 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.844150 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.844200 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs4f7\" (UniqueName: \"kubernetes.io/projected/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-kube-api-access-xs4f7\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.845003 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-logs\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.845094 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.848127 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.848361 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a7b71ee9dc20b2cd8e0489051d74fcf4864cc02a892819f8a5785e080087446e/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.849204 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.858218 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.861136 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.866259 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs4f7\" (UniqueName: \"kubernetes.io/projected/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-kube-api-access-xs4f7\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.938495 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:15 crc kubenswrapper[4886]: I0129 17:06:15.946868 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 17:06:16 crc kubenswrapper[4886]: I0129 17:06:16.995070 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:06:17 crc kubenswrapper[4886]: I0129 17:06:17.069950 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:06:20 crc kubenswrapper[4886]: I0129 17:06:20.487213 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.202:5353: connect: connection refused" Jan 29 17:06:25 crc kubenswrapper[4886]: I0129 17:06:25.487124 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.202:5353: connect: connection refused" Jan 29 17:06:25 crc kubenswrapper[4886]: I0129 17:06:25.488635 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:35 crc kubenswrapper[4886]: I0129 17:06:35.488619 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.202:5353: i/o timeout" Jan 29 17:06:40 crc kubenswrapper[4886]: I0129 17:06:40.490386 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.202:5353: i/o timeout" Jan 29 17:06:45 crc kubenswrapper[4886]: I0129 17:06:45.491596 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.202:5353: i/o timeout" Jan 29 17:06:46 crc kubenswrapper[4886]: E0129 17:06:46.350073 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Jan 29 17:06:46 crc kubenswrapper[4886]: E0129 17:06:46.350545 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9v7hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-6nmwn_openstack(a0058f32-ae80-4dde-9dce-095c62f45979): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:06:46 crc kubenswrapper[4886]: E0129 17:06:46.351769 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-6nmwn" podUID="a0058f32-ae80-4dde-9dce-095c62f45979" Jan 29 17:06:46 crc kubenswrapper[4886]: E0129 17:06:46.673634 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-6nmwn" podUID="a0058f32-ae80-4dde-9dce-095c62f45979" Jan 29 17:06:46 crc kubenswrapper[4886]: E0129 17:06:46.883197 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 29 17:06:46 crc kubenswrapper[4886]: E0129 17:06:46.883478 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-86p7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-q2dxw_openstack(ffb099fb-7bdb-4969-b3cb-6fc4ef498afd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:06:46 crc kubenswrapper[4886]: E0129 17:06:46.884820 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-q2dxw" podUID="ffb099fb-7bdb-4969-b3cb-6fc4ef498afd" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.107991 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.212277 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88mjr\" (UniqueName: \"kubernetes.io/projected/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-kube-api-access-88mjr\") pod \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.212512 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-swift-storage-0\") pod \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.212591 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-config\") pod \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.212665 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-sb\") pod \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.212736 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-nb\") pod \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.212809 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-svc\") pod \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\" (UID: \"2c74b25a-0daf-4c7e-a023-a7082d8d73cf\") " Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.216774 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-kube-api-access-88mjr" (OuterVolumeSpecName: "kube-api-access-88mjr") pod "2c74b25a-0daf-4c7e-a023-a7082d8d73cf" (UID: "2c74b25a-0daf-4c7e-a023-a7082d8d73cf"). InnerVolumeSpecName "kube-api-access-88mjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.272666 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-config" (OuterVolumeSpecName: "config") pod "2c74b25a-0daf-4c7e-a023-a7082d8d73cf" (UID: "2c74b25a-0daf-4c7e-a023-a7082d8d73cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.275616 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2c74b25a-0daf-4c7e-a023-a7082d8d73cf" (UID: "2c74b25a-0daf-4c7e-a023-a7082d8d73cf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.283128 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c74b25a-0daf-4c7e-a023-a7082d8d73cf" (UID: "2c74b25a-0daf-4c7e-a023-a7082d8d73cf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.284159 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c74b25a-0daf-4c7e-a023-a7082d8d73cf" (UID: "2c74b25a-0daf-4c7e-a023-a7082d8d73cf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.288299 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c74b25a-0daf-4c7e-a023-a7082d8d73cf" (UID: "2c74b25a-0daf-4c7e-a023-a7082d8d73cf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.316413 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.316487 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.316502 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.316516 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.316530 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.316656 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88mjr\" (UniqueName: \"kubernetes.io/projected/2c74b25a-0daf-4c7e-a023-a7082d8d73cf-kube-api-access-88mjr\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.686087 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" event={"ID":"2c74b25a-0daf-4c7e-a023-a7082d8d73cf","Type":"ContainerDied","Data":"02d41ab973396ad0b9067fb7d12dd022b4232ab3e2460c195caa3ce7c6f4e250"} Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.686460 4886 scope.go:117] "RemoveContainer" containerID="2b118af4cda69e6639958e45442cbb3e2fb4932b299bff9387ea1c20cb9f4e45" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.686150 4886 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" Jan 29 17:06:47 crc kubenswrapper[4886]: E0129 17:06:47.689647 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-q2dxw" podUID="ffb099fb-7bdb-4969-b3cb-6fc4ef498afd" Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.734225 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5smww"] Jan 29 17:06:47 crc kubenswrapper[4886]: I0129 17:06:47.743856 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5smww"] Jan 29 17:06:48 crc kubenswrapper[4886]: E0129 17:06:48.613514 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 29 17:06:48 crc kubenswrapper[4886]: E0129 17:06:48.613880 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2rkdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResi
zePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-j5gfz_openstack(04dae116-ceca-4588-9cba-1266bfa92caf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 17:06:48 crc kubenswrapper[4886]: E0129 17:06:48.616730 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-j5gfz" podUID="04dae116-ceca-4588-9cba-1266bfa92caf" Jan 29 17:06:48 crc kubenswrapper[4886]: I0129 17:06:48.628879 4886 scope.go:117] "RemoveContainer" containerID="cfb7fc79ff5a728a120650052bd3ff240e06f929a54c3a2f5efc1ad8f2dd226b" Jan 29 17:06:48 crc kubenswrapper[4886]: I0129 17:06:48.635149 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" path="/var/lib/kubelet/pods/2c74b25a-0daf-4c7e-a023-a7082d8d73cf/volumes" Jan 29 17:06:48 crc kubenswrapper[4886]: E0129 17:06:48.733315 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-j5gfz" podUID="04dae116-ceca-4588-9cba-1266bfa92caf" Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.202937 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p924n"] Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.276494 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.350315 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-96hn8"] Jan 29 17:06:49 crc kubenswrapper[4886]: W0129 17:06:49.370815 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80d171a6_11ab_4cdf_b469_acb56ff11735.slice/crio-81bf0e642c0dbb7fd724006f0c2c518606f7b43d2584453df92bcfe55b829357 WatchSource:0}: Error finding container 81bf0e642c0dbb7fd724006f0c2c518606f7b43d2584453df92bcfe55b829357: Status 404 returned error can't find the container with id 81bf0e642c0dbb7fd724006f0c2c518606f7b43d2584453df92bcfe55b829357 Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.428088 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:06:49 crc kubenswrapper[4886]: W0129 17:06:49.434781 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod426bc8f7_73fc_4b57_acd0_7fd8cc26b8a5.slice/crio-d5621121b70db635809d6807b77222d4ab1e04f02615d9fa23d98fc438df1164 WatchSource:0}: Error finding container d5621121b70db635809d6807b77222d4ab1e04f02615d9fa23d98fc438df1164: Status 404 returned error can't find the container with id d5621121b70db635809d6807b77222d4ab1e04f02615d9fa23d98fc438df1164 Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.737437 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f2a90939-bcbf-44d8-8ebe-7ab1d118b360","Type":"ContainerStarted","Data":"a06b5cce5a1745b2439afd2d0c3ff6b9f761ea3f97b4ad1a67abe7ae84d84767"} Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.739953 4886 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-bootstrap-p924n" event={"ID":"68cdc6ed-ce63-43af-8502-b36cc0ae788a","Type":"ContainerStarted","Data":"6375ad3e949f813db64562de4e61fa2910abcb717d2e211c509e5dbcb6b07f3a"} Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.739979 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p924n" event={"ID":"68cdc6ed-ce63-43af-8502-b36cc0ae788a","Type":"ContainerStarted","Data":"76b68b08b92b70f0de4c1a2319c04176b3479b075a2ab3366608b1fce7ae76ee"} Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.741599 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5","Type":"ContainerStarted","Data":"d5621121b70db635809d6807b77222d4ab1e04f02615d9fa23d98fc438df1164"} Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.743221 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87986c31-37d7-4624-87a2-b5678e01d865","Type":"ContainerStarted","Data":"6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4"} Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.745209 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8m2mm" event={"ID":"8923ac96-087a-425b-a8b4-c09aa4be3d78","Type":"ContainerStarted","Data":"b56f617415d312996740dc4a8697ef643e749e77f4339179492aab6c12f2f0d4"} Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.753386 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" event={"ID":"80d171a6-11ab-4cdf-b469-acb56ff11735","Type":"ContainerStarted","Data":"26aa10c89bd28f4d17b03fabdd3c3dd7d4b1ab633d533650ee03163b7c656cd5"} Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.753432 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" event={"ID":"80d171a6-11ab-4cdf-b469-acb56ff11735","Type":"ContainerStarted","Data":"81bf0e642c0dbb7fd724006f0c2c518606f7b43d2584453df92bcfe55b829357"} Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.763013 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-p924n" podStartSLOduration=36.762965829 podStartE2EDuration="36.762965829s" podCreationTimestamp="2026-01-29 17:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:06:49.76052606 +0000 UTC m=+2692.669245332" watchObservedRunningTime="2026-01-29 17:06:49.762965829 +0000 UTC m=+2692.671685111" Jan 29 17:06:49 crc kubenswrapper[4886]: I0129 17:06:49.827727 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8m2mm" podStartSLOduration=5.439114842 podStartE2EDuration="45.827705862s" podCreationTimestamp="2026-01-29 17:06:04 +0000 UTC" firstStartedPulling="2026-01-29 17:06:06.563731085 +0000 UTC m=+2649.472450347" lastFinishedPulling="2026-01-29 17:06:46.952322095 +0000 UTC m=+2689.861041367" observedRunningTime="2026-01-29 17:06:49.804968982 +0000 UTC m=+2692.713688254" watchObservedRunningTime="2026-01-29 17:06:49.827705862 +0000 UTC m=+2692.736425134" Jan 29 17:06:50 crc kubenswrapper[4886]: I0129 17:06:50.492660 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-5smww" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 
10.217.0.202:5353: i/o timeout" Jan 29 17:06:50 crc kubenswrapper[4886]: I0129 17:06:50.823734 4886 generic.go:334] "Generic (PLEG): container finished" podID="80d171a6-11ab-4cdf-b469-acb56ff11735" containerID="26aa10c89bd28f4d17b03fabdd3c3dd7d4b1ab633d533650ee03163b7c656cd5" exitCode=0 Jan 29 17:06:50 crc kubenswrapper[4886]: I0129 17:06:50.824108 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" event={"ID":"80d171a6-11ab-4cdf-b469-acb56ff11735","Type":"ContainerDied","Data":"26aa10c89bd28f4d17b03fabdd3c3dd7d4b1ab633d533650ee03163b7c656cd5"} Jan 29 17:06:50 crc kubenswrapper[4886]: I0129 17:06:50.824859 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:50 crc kubenswrapper[4886]: I0129 17:06:50.824929 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" event={"ID":"80d171a6-11ab-4cdf-b469-acb56ff11735","Type":"ContainerStarted","Data":"705da8d91cb45e05b6aa5ab5b116ce8252bf3f498078113a7eee5edc1d206bca"} Jan 29 17:06:50 crc kubenswrapper[4886]: I0129 17:06:50.834739 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f2a90939-bcbf-44d8-8ebe-7ab1d118b360","Type":"ContainerStarted","Data":"33ad2a1126eff6cbb88ccc77df323fa1e654c5d2155c0985168da0fd53e1864a"} Jan 29 17:06:50 crc kubenswrapper[4886]: I0129 17:06:50.834898 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f2a90939-bcbf-44d8-8ebe-7ab1d118b360" containerName="glance-log" containerID="cri-o://33ad2a1126eff6cbb88ccc77df323fa1e654c5d2155c0985168da0fd53e1864a" gracePeriod=30 Jan 29 17:06:50 crc kubenswrapper[4886]: I0129 17:06:50.835128 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f2a90939-bcbf-44d8-8ebe-7ab1d118b360" containerName="glance-httpd" containerID="cri-o://95a7d3b8a9e32ae8ae2e3ef610040f7131916bc7de34db8cc1af0fec9c3ef960" gracePeriod=30 Jan 29 17:06:50 crc kubenswrapper[4886]: I0129 17:06:50.841061 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5","Type":"ContainerStarted","Data":"fb8fc548f591be6e16630c1c9171e7ca1c4549f03107635ab3d54cf848daec39"} Jan 29 17:06:50 crc kubenswrapper[4886]: I0129 17:06:50.861904 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" podStartSLOduration=36.861888171 podStartE2EDuration="36.861888171s" podCreationTimestamp="2026-01-29 17:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:06:50.857981721 +0000 UTC m=+2693.766701013" watchObservedRunningTime="2026-01-29 17:06:50.861888171 +0000 UTC m=+2693.770607443" Jan 29 17:06:50 crc kubenswrapper[4886]: I0129 17:06:50.892624 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=36.892594516 podStartE2EDuration="36.892594516s" podCreationTimestamp="2026-01-29 17:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:06:50.879644341 +0000 UTC m=+2693.788363643" watchObservedRunningTime="2026-01-29 
17:06:50.892594516 +0000 UTC m=+2693.801313788" Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.867289 4886 generic.go:334] "Generic (PLEG): container finished" podID="f2a90939-bcbf-44d8-8ebe-7ab1d118b360" containerID="95a7d3b8a9e32ae8ae2e3ef610040f7131916bc7de34db8cc1af0fec9c3ef960" exitCode=143 Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.867873 4886 generic.go:334] "Generic (PLEG): container finished" podID="f2a90939-bcbf-44d8-8ebe-7ab1d118b360" containerID="33ad2a1126eff6cbb88ccc77df323fa1e654c5d2155c0985168da0fd53e1864a" exitCode=143 Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.867364 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f2a90939-bcbf-44d8-8ebe-7ab1d118b360","Type":"ContainerDied","Data":"95a7d3b8a9e32ae8ae2e3ef610040f7131916bc7de34db8cc1af0fec9c3ef960"} Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.867911 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f2a90939-bcbf-44d8-8ebe-7ab1d118b360","Type":"ContainerDied","Data":"33ad2a1126eff6cbb88ccc77df323fa1e654c5d2155c0985168da0fd53e1864a"} Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.867923 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f2a90939-bcbf-44d8-8ebe-7ab1d118b360","Type":"ContainerDied","Data":"a06b5cce5a1745b2439afd2d0c3ff6b9f761ea3f97b4ad1a67abe7ae84d84767"} Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.867933 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a06b5cce5a1745b2439afd2d0c3ff6b9f761ea3f97b4ad1a67abe7ae84d84767" Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.916914 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.947575 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-combined-ca-bundle\") pod \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.947742 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-config-data\") pod \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.947862 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.947917 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-648q7\" (UniqueName: \"kubernetes.io/projected/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-kube-api-access-648q7\") pod \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.947952 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-logs\") pod \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.948000 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-scripts\") pod \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.948171 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-httpd-run\") pod \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\" (UID: \"f2a90939-bcbf-44d8-8ebe-7ab1d118b360\") " Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.948833 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f2a90939-bcbf-44d8-8ebe-7ab1d118b360" (UID: "f2a90939-bcbf-44d8-8ebe-7ab1d118b360"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.949277 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-logs" (OuterVolumeSpecName: "logs") pod "f2a90939-bcbf-44d8-8ebe-7ab1d118b360" (UID: "f2a90939-bcbf-44d8-8ebe-7ab1d118b360"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.949883 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.949904 4886 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.954922 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-kube-api-access-648q7" (OuterVolumeSpecName: "kube-api-access-648q7") pod "f2a90939-bcbf-44d8-8ebe-7ab1d118b360" (UID: "f2a90939-bcbf-44d8-8ebe-7ab1d118b360"). InnerVolumeSpecName "kube-api-access-648q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.955559 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-scripts" (OuterVolumeSpecName: "scripts") pod "f2a90939-bcbf-44d8-8ebe-7ab1d118b360" (UID: "f2a90939-bcbf-44d8-8ebe-7ab1d118b360"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:51 crc kubenswrapper[4886]: I0129 17:06:51.976368 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1" (OuterVolumeSpecName: "glance") pod "f2a90939-bcbf-44d8-8ebe-7ab1d118b360" (UID: "f2a90939-bcbf-44d8-8ebe-7ab1d118b360"). InnerVolumeSpecName "pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.010173 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2a90939-bcbf-44d8-8ebe-7ab1d118b360" (UID: "f2a90939-bcbf-44d8-8ebe-7ab1d118b360"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.037832 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-config-data" (OuterVolumeSpecName: "config-data") pod "f2a90939-bcbf-44d8-8ebe-7ab1d118b360" (UID: "f2a90939-bcbf-44d8-8ebe-7ab1d118b360"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.051346 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.051376 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.051411 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") on node \"crc\" " Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.051422 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-648q7\" (UniqueName: \"kubernetes.io/projected/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-kube-api-access-648q7\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.051434 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2a90939-bcbf-44d8-8ebe-7ab1d118b360-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.083180 4886 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.083374 4886 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1") on node "crc" Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.153191 4886 reconciler_common.go:293] "Volume detached for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.884746 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5","Type":"ContainerStarted","Data":"794f8e0bf261a512c459ecf62c8c7c26bca5d60128a7b4f23734cabe8f7c898d"} Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.884938 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" containerName="glance-log" containerID="cri-o://fb8fc548f591be6e16630c1c9171e7ca1c4549f03107635ab3d54cf848daec39" gracePeriod=30 Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.885026 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" containerName="glance-httpd" containerID="cri-o://794f8e0bf261a512c459ecf62c8c7c26bca5d60128a7b4f23734cabe8f7c898d" gracePeriod=30 Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.891151 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"87986c31-37d7-4624-87a2-b5678e01d865","Type":"ContainerStarted","Data":"fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927"} Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.891194 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.923987 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=38.9239639 podStartE2EDuration="38.9239639s" podCreationTimestamp="2026-01-29 17:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:06:52.91684983 +0000 UTC m=+2695.825569112" watchObservedRunningTime="2026-01-29 17:06:52.9239639 +0000 UTC m=+2695.832683172" Jan 29 17:06:52 crc kubenswrapper[4886]: I0129 17:06:52.950305 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:52.982114 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:52.991396 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:06:53 crc kubenswrapper[4886]: E0129 17:06:52.991992 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="dnsmasq-dns" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:52.992007 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="dnsmasq-dns" Jan 29 17:06:53 crc kubenswrapper[4886]: E0129 17:06:52.992022 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a90939-bcbf-44d8-8ebe-7ab1d118b360" containerName="glance-log" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:52.992031 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a90939-bcbf-44d8-8ebe-7ab1d118b360" containerName="glance-log" Jan 29 17:06:53 crc kubenswrapper[4886]: E0129 17:06:52.992043 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a90939-bcbf-44d8-8ebe-7ab1d118b360" containerName="glance-httpd" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:52.992050 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a90939-bcbf-44d8-8ebe-7ab1d118b360" containerName="glance-httpd" Jan 29 17:06:53 crc kubenswrapper[4886]: E0129 17:06:52.992065 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="init" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:52.992072 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="init" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:52.992284 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a90939-bcbf-44d8-8ebe-7ab1d118b360" containerName="glance-log" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:52.992321 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a90939-bcbf-44d8-8ebe-7ab1d118b360" containerName="glance-httpd" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:52.992350 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c74b25a-0daf-4c7e-a023-a7082d8d73cf" containerName="dnsmasq-dns" Jan 29 
17:06:53 crc kubenswrapper[4886]: I0129 17:06:52.993657 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:52.999693 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.024565 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.026180 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.191111 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-config-data\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.191166 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-logs\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.191201 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.191475 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fglvx\" (UniqueName: \"kubernetes.io/projected/849de0d3-3456-44c2-bef4-3a435e4a432a-kube-api-access-fglvx\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.191526 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-scripts\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.191827 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.191942 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " 
pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.191983 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.293779 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-config-data\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.293845 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-logs\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.293899 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.294119 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fglvx\" (UniqueName: \"kubernetes.io/projected/849de0d3-3456-44c2-bef4-3a435e4a432a-kube-api-access-fglvx\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.294203 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-scripts\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.294342 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-logs\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.294507 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.294612 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " 
pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.294660 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.294770 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.296798 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.296853 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9fc1bf04f61733e1543e4c6d32069c38c610c3d0fa9a349fa6a409f3542d3c50/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.299452 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-scripts\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.301933 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.305002 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.305828 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-config-data\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.316866 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fglvx\" (UniqueName: \"kubernetes.io/projected/849de0d3-3456-44c2-bef4-3a435e4a432a-kube-api-access-fglvx\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc 
kubenswrapper[4886]: I0129 17:06:53.346477 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:06:53 crc kubenswrapper[4886]: I0129 17:06:53.438880 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:53.905126 4886 generic.go:334] "Generic (PLEG): container finished" podID="426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" containerID="794f8e0bf261a512c459ecf62c8c7c26bca5d60128a7b4f23734cabe8f7c898d" exitCode=143 Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:53.905471 4886 generic.go:334] "Generic (PLEG): container finished" podID="426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" containerID="fb8fc548f591be6e16630c1c9171e7ca1c4549f03107635ab3d54cf848daec39" exitCode=143 Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:53.905178 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5","Type":"ContainerDied","Data":"794f8e0bf261a512c459ecf62c8c7c26bca5d60128a7b4f23734cabe8f7c898d"} Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:53.905520 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5","Type":"ContainerDied","Data":"fb8fc548f591be6e16630c1c9171e7ca1c4549f03107635ab3d54cf848daec39"} Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:53.905535 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5","Type":"ContainerDied","Data":"d5621121b70db635809d6807b77222d4ab1e04f02615d9fa23d98fc438df1164"} Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:53.905548 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5621121b70db635809d6807b77222d4ab1e04f02615d9fa23d98fc438df1164" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:53.941504 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.111358 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-logs\") pod \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.111467 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-scripts\") pod \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.111685 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.111727 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs4f7\" (UniqueName: \"kubernetes.io/projected/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-kube-api-access-xs4f7\") pod \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.111807 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-httpd-run\") pod \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.111834 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-combined-ca-bundle\") pod \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.111971 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-config-data\") pod \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\" (UID: \"426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5\") " Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.119728 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-logs" (OuterVolumeSpecName: "logs") pod "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" (UID: "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.119929 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" (UID: "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.130360 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-scripts" (OuterVolumeSpecName: "scripts") pod "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" (UID: "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.130558 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-kube-api-access-xs4f7" (OuterVolumeSpecName: "kube-api-access-xs4f7") pod "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" (UID: "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5"). InnerVolumeSpecName "kube-api-access-xs4f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.196441 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" (UID: "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.215820 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.215845 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.215853 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs4f7\" (UniqueName: \"kubernetes.io/projected/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-kube-api-access-xs4f7\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.215864 4886 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.215872 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.258598 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-config-data" (OuterVolumeSpecName: "config-data") pod "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" (UID: "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.320802 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.376898 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019" (OuterVolumeSpecName: "glance") pod "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" (UID: "426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5"). InnerVolumeSpecName "pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.422462 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") on node \"crc\" " Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.448428 4886 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.448585 4886 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019") on node "crc" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.524743 4886 reconciler_common.go:293] "Volume detached for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") on node \"crc\" DevicePath \"\"" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.632387 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a90939-bcbf-44d8-8ebe-7ab1d118b360" path="/var/lib/kubelet/pods/f2a90939-bcbf-44d8-8ebe-7ab1d118b360/volumes" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.914113 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.960877 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.981004 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.992957 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:06:54 crc kubenswrapper[4886]: E0129 17:06:54.993539 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" containerName="glance-httpd" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.993554 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" containerName="glance-httpd" Jan 29 17:06:54 crc kubenswrapper[4886]: E0129 17:06:54.993587 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" containerName="glance-log" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.993596 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" containerName="glance-log" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.993840 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" containerName="glance-httpd" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.993856 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" containerName="glance-log" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.994956 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.998208 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 29 17:06:54 crc kubenswrapper[4886]: I0129 17:06:54.998444 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.005535 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.145827 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.145866 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.145971 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-logs\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.145995 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhpzr\" (UniqueName: \"kubernetes.io/projected/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-kube-api-access-fhpzr\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.146016 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.146043 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.146111 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.146305 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.203866 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:06:55 crc kubenswrapper[4886]: W0129 17:06:55.213513 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod849de0d3_3456_44c2_bef4_3a435e4a432a.slice/crio-6c945ea15f303c81064b58dfa01521088d6d511849d81e35019f4fd66c782c28 WatchSource:0}: Error finding container 6c945ea15f303c81064b58dfa01521088d6d511849d81e35019f4fd66c782c28: Status 404 returned error can't find the container with id 6c945ea15f303c81064b58dfa01521088d6d511849d81e35019f4fd66c782c28 Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.248209 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-logs\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.248261 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhpzr\" (UniqueName: \"kubernetes.io/projected/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-kube-api-access-fhpzr\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.248334 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.248362 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.248430 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.248470 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.248519 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.248537 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.249815 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-logs\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.252937 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.265753 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.265854 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhpzr\" (UniqueName: \"kubernetes.io/projected/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-kube-api-access-fhpzr\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.265885 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.266063 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a7b71ee9dc20b2cd8e0489051d74fcf4864cc02a892819f8a5785e080087446e/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.266126 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.266275 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.269928 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.322710 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.622395 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 17:06:55 crc kubenswrapper[4886]: I0129 17:06:55.935086 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"849de0d3-3456-44c2-bef4-3a435e4a432a","Type":"ContainerStarted","Data":"6c945ea15f303c81064b58dfa01521088d6d511849d81e35019f4fd66c782c28"} Jan 29 17:06:56 crc kubenswrapper[4886]: I0129 17:06:56.545700 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:06:56 crc kubenswrapper[4886]: I0129 17:06:56.646511 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5" path="/var/lib/kubelet/pods/426bc8f7-73fc-4b57-acd0-7fd8cc26b8a5/volumes" Jan 29 17:06:56 crc kubenswrapper[4886]: I0129 17:06:56.949640 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf","Type":"ContainerStarted","Data":"71bc8d6cf1178c38541a40863263406b012b61b297b4f5183d44e11e56405a8a"} Jan 29 17:06:58 crc kubenswrapper[4886]: I0129 17:06:58.992161 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"849de0d3-3456-44c2-bef4-3a435e4a432a","Type":"ContainerStarted","Data":"685691dd71892e3462a49d43e961e4398610edbd2ff6858db714971fb73711e6"} Jan 29 17:06:59 crc kubenswrapper[4886]: I0129 17:06:59.870485 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:06:59 crc kubenswrapper[4886]: I0129 17:06:59.962345 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-t8rs7"] Jan 29 17:06:59 crc kubenswrapper[4886]: I0129 17:06:59.962586 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-t8rs7" podUID="eb212bbc-3071-4fda-968d-b6d3f19996ee" containerName="dnsmasq-dns" containerID="cri-o://54bdeb43a338f0b719b206ca212f50bc02c6d2592ec0ac66c6b8743631a3cf1b" gracePeriod=10 Jan 29 17:07:01 crc kubenswrapper[4886]: I0129 17:07:01.017636 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf","Type":"ContainerStarted","Data":"d46a9e5456f252ab3dd8ef0ca224f83e7f91449851fd433a23e9070eb20e028e"} Jan 29 17:07:01 crc kubenswrapper[4886]: I0129 17:07:01.507279 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-t8rs7" podUID="eb212bbc-3071-4fda-968d-b6d3f19996ee" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.163:5353: connect: connection refused" Jan 29 17:07:02 crc kubenswrapper[4886]: I0129 17:07:02.031762 4886 generic.go:334] "Generic (PLEG): container finished" podID="eb212bbc-3071-4fda-968d-b6d3f19996ee" containerID="54bdeb43a338f0b719b206ca212f50bc02c6d2592ec0ac66c6b8743631a3cf1b" exitCode=0 Jan 29 17:07:02 crc kubenswrapper[4886]: I0129 17:07:02.032010 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-t8rs7" event={"ID":"eb212bbc-3071-4fda-968d-b6d3f19996ee","Type":"ContainerDied","Data":"54bdeb43a338f0b719b206ca212f50bc02c6d2592ec0ac66c6b8743631a3cf1b"} Jan 29 17:07:03 crc kubenswrapper[4886]: I0129 17:07:03.050274 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"849de0d3-3456-44c2-bef4-3a435e4a432a","Type":"ContainerStarted","Data":"5e2f27254ecaeae6872715e18449eaa22b877597c8124da7a49920ec97100c5d"} Jan 29 17:07:04 crc kubenswrapper[4886]: I0129 17:07:04.061643 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf","Type":"ContainerStarted","Data":"819d3c493df902007da456da0899d275e457a2f0ed2e48aedaf84f652820cb61"} Jan 29 17:07:04 crc kubenswrapper[4886]: I0129 17:07:04.098113 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.098092326 podStartE2EDuration="12.098092326s" podCreationTimestamp="2026-01-29 17:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:04.088088954 +0000 UTC m=+2706.996808246" watchObservedRunningTime="2026-01-29 17:07:04.098092326 +0000 UTC m=+2707.006811598" Jan 29 17:07:05 crc kubenswrapper[4886]: I0129 17:07:05.104680 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.104649747 podStartE2EDuration="11.104649747s" podCreationTimestamp="2026-01-29 17:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:05.094750958 +0000 UTC m=+2708.003470260" watchObservedRunningTime="2026-01-29 17:07:05.104649747 +0000 UTC m=+2708.013369039" Jan 29 17:07:05 crc kubenswrapper[4886]: I0129 17:07:05.622800 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 17:07:05 crc kubenswrapper[4886]: I0129 17:07:05.622858 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 17:07:05 crc kubenswrapper[4886]: I0129 17:07:05.926607 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 17:07:05 crc kubenswrapper[4886]: I0129 17:07:05.926770 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 17:07:06 crc kubenswrapper[4886]: I0129 17:07:06.084254 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 17:07:06 crc kubenswrapper[4886]: I0129 17:07:06.084569 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.375171 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.488612 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-nb\") pod \"eb212bbc-3071-4fda-968d-b6d3f19996ee\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.488920 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czcfr\" (UniqueName: \"kubernetes.io/projected/eb212bbc-3071-4fda-968d-b6d3f19996ee-kube-api-access-czcfr\") pod \"eb212bbc-3071-4fda-968d-b6d3f19996ee\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.489121 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-sb\") pod \"eb212bbc-3071-4fda-968d-b6d3f19996ee\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.489226 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-config\") pod \"eb212bbc-3071-4fda-968d-b6d3f19996ee\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.489292 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-dns-svc\") pod \"eb212bbc-3071-4fda-968d-b6d3f19996ee\" (UID: \"eb212bbc-3071-4fda-968d-b6d3f19996ee\") " Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.500567 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb212bbc-3071-4fda-968d-b6d3f19996ee-kube-api-access-czcfr" (OuterVolumeSpecName: "kube-api-access-czcfr") pod "eb212bbc-3071-4fda-968d-b6d3f19996ee" (UID: "eb212bbc-3071-4fda-968d-b6d3f19996ee"). InnerVolumeSpecName "kube-api-access-czcfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.541816 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-config" (OuterVolumeSpecName: "config") pod "eb212bbc-3071-4fda-968d-b6d3f19996ee" (UID: "eb212bbc-3071-4fda-968d-b6d3f19996ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.545151 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb212bbc-3071-4fda-968d-b6d3f19996ee" (UID: "eb212bbc-3071-4fda-968d-b6d3f19996ee"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.557868 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb212bbc-3071-4fda-968d-b6d3f19996ee" (UID: "eb212bbc-3071-4fda-968d-b6d3f19996ee"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.557956 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb212bbc-3071-4fda-968d-b6d3f19996ee" (UID: "eb212bbc-3071-4fda-968d-b6d3f19996ee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.592930 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czcfr\" (UniqueName: \"kubernetes.io/projected/eb212bbc-3071-4fda-968d-b6d3f19996ee-kube-api-access-czcfr\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.592968 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.592981 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.592992 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:07 crc kubenswrapper[4886]: I0129 17:07:07.593005 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb212bbc-3071-4fda-968d-b6d3f19996ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:08 crc kubenswrapper[4886]: I0129 17:07:08.110427 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-t8rs7" event={"ID":"eb212bbc-3071-4fda-968d-b6d3f19996ee","Type":"ContainerDied","Data":"da2d61dccf59424cc14b54a614d36ae066f9a9d76b8f120a8702b08ed1b7f949"} Jan 29 17:07:08 crc kubenswrapper[4886]: I0129 17:07:08.110793 4886 scope.go:117] "RemoveContainer" containerID="54bdeb43a338f0b719b206ca212f50bc02c6d2592ec0ac66c6b8743631a3cf1b" Jan 29 17:07:08 crc kubenswrapper[4886]: I0129 17:07:08.110499 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-t8rs7" Jan 29 17:07:08 crc kubenswrapper[4886]: I0129 17:07:08.153364 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-t8rs7"] Jan 29 17:07:08 crc kubenswrapper[4886]: I0129 17:07:08.166096 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-t8rs7"] Jan 29 17:07:08 crc kubenswrapper[4886]: I0129 17:07:08.627908 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb212bbc-3071-4fda-968d-b6d3f19996ee" path="/var/lib/kubelet/pods/eb212bbc-3071-4fda-968d-b6d3f19996ee/volumes" Jan 29 17:07:08 crc kubenswrapper[4886]: I0129 17:07:08.661225 4886 scope.go:117] "RemoveContainer" containerID="71b921e8db9e8e747c69aeafc44470b62e0400a32e8c7e760d1d991c175cbc64" Jan 29 17:07:11 crc kubenswrapper[4886]: I0129 17:07:11.507857 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-t8rs7" podUID="eb212bbc-3071-4fda-968d-b6d3f19996ee" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.163:5353: i/o timeout" Jan 29 17:07:12 crc kubenswrapper[4886]: I0129 17:07:12.166143 4886 generic.go:334] "Generic (PLEG): container finished" podID="68cdc6ed-ce63-43af-8502-b36cc0ae788a" containerID="6375ad3e949f813db64562de4e61fa2910abcb717d2e211c509e5dbcb6b07f3a" exitCode=0 Jan 29 17:07:12 crc kubenswrapper[4886]: I0129 17:07:12.166194 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p924n" event={"ID":"68cdc6ed-ce63-43af-8502-b36cc0ae788a","Type":"ContainerDied","Data":"6375ad3e949f813db64562de4e61fa2910abcb717d2e211c509e5dbcb6b07f3a"} Jan 29 17:07:13 crc kubenswrapper[4886]: I0129 17:07:13.439786 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 29 17:07:13 crc kubenswrapper[4886]: I0129 17:07:13.440133 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 29 17:07:13 crc kubenswrapper[4886]: I0129 17:07:13.593258 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 29 17:07:13 crc kubenswrapper[4886]: I0129 17:07:13.596464 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.188743 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.188788 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.890201 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-p924n" Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.965111 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-config-data\") pod \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.965994 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq47h\" (UniqueName: \"kubernetes.io/projected/68cdc6ed-ce63-43af-8502-b36cc0ae788a-kube-api-access-cq47h\") pod \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.966064 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-combined-ca-bundle\") pod \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.966153 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-credential-keys\") pod \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.966229 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-fernet-keys\") pod \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.966287 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-scripts\") pod \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\" (UID: \"68cdc6ed-ce63-43af-8502-b36cc0ae788a\") " Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.970575 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-scripts" (OuterVolumeSpecName: "scripts") pod "68cdc6ed-ce63-43af-8502-b36cc0ae788a" (UID: "68cdc6ed-ce63-43af-8502-b36cc0ae788a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.971579 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "68cdc6ed-ce63-43af-8502-b36cc0ae788a" (UID: "68cdc6ed-ce63-43af-8502-b36cc0ae788a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.971609 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "68cdc6ed-ce63-43af-8502-b36cc0ae788a" (UID: "68cdc6ed-ce63-43af-8502-b36cc0ae788a"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:14 crc kubenswrapper[4886]: I0129 17:07:14.972879 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68cdc6ed-ce63-43af-8502-b36cc0ae788a-kube-api-access-cq47h" (OuterVolumeSpecName: "kube-api-access-cq47h") pod "68cdc6ed-ce63-43af-8502-b36cc0ae788a" (UID: "68cdc6ed-ce63-43af-8502-b36cc0ae788a"). InnerVolumeSpecName "kube-api-access-cq47h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.047671 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-config-data" (OuterVolumeSpecName: "config-data") pod "68cdc6ed-ce63-43af-8502-b36cc0ae788a" (UID: "68cdc6ed-ce63-43af-8502-b36cc0ae788a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.047762 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68cdc6ed-ce63-43af-8502-b36cc0ae788a" (UID: "68cdc6ed-ce63-43af-8502-b36cc0ae788a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.069022 4886 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.069051 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.069059 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.069068 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq47h\" (UniqueName: \"kubernetes.io/projected/68cdc6ed-ce63-43af-8502-b36cc0ae788a-kube-api-access-cq47h\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.069079 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.069088 4886 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68cdc6ed-ce63-43af-8502-b36cc0ae788a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.227550 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q2dxw" event={"ID":"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd","Type":"ContainerStarted","Data":"462d0b69d42ff5bdae3194985f827b482bb0c2607dbc772e35d27e51d1171c94"} Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.234668 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"87986c31-37d7-4624-87a2-b5678e01d865","Type":"ContainerStarted","Data":"2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf"} Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.236260 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6nmwn" event={"ID":"a0058f32-ae80-4dde-9dce-095c62f45979","Type":"ContainerStarted","Data":"ab83d2d0c36aaea48832e86668e20e1d6f6f876644014c27f52bee83b6960b7d"} Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.238617 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p924n" event={"ID":"68cdc6ed-ce63-43af-8502-b36cc0ae788a","Type":"ContainerDied","Data":"76b68b08b92b70f0de4c1a2319c04176b3479b075a2ab3366608b1fce7ae76ee"} Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.238645 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76b68b08b92b70f0de4c1a2319c04176b3479b075a2ab3366608b1fce7ae76ee" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.238711 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p924n" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.249169 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-q2dxw" podStartSLOduration=3.735655083 podStartE2EDuration="1m11.249151684s" podCreationTimestamp="2026-01-29 17:06:04 +0000 UTC" firstStartedPulling="2026-01-29 17:06:07.074218153 +0000 UTC m=+2649.982937415" lastFinishedPulling="2026-01-29 17:07:14.587714744 +0000 UTC m=+2717.496434016" observedRunningTime="2026-01-29 17:07:15.24581 +0000 UTC m=+2718.154529292" watchObservedRunningTime="2026-01-29 17:07:15.249151684 +0000 UTC m=+2718.157870956" Jan 29 17:07:15 crc kubenswrapper[4886]: I0129 17:07:15.275509 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-6nmwn" podStartSLOduration=2.780281854 podStartE2EDuration="1m11.275492216s" podCreationTimestamp="2026-01-29 17:06:04 +0000 UTC" firstStartedPulling="2026-01-29 17:06:06.093981444 +0000 UTC m=+2649.002700726" lastFinishedPulling="2026-01-29 17:07:14.589191816 +0000 UTC m=+2717.497911088" observedRunningTime="2026-01-29 17:07:15.261859182 +0000 UTC m=+2718.170578474" watchObservedRunningTime="2026-01-29 17:07:15.275492216 +0000 UTC m=+2718.184211488" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.015604 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5499bdc9-q6hr4"] Jan 29 17:07:16 crc kubenswrapper[4886]: E0129 17:07:16.016537 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb212bbc-3071-4fda-968d-b6d3f19996ee" containerName="init" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.016550 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb212bbc-3071-4fda-968d-b6d3f19996ee" containerName="init" Jan 29 17:07:16 crc kubenswrapper[4886]: E0129 17:07:16.016621 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cdc6ed-ce63-43af-8502-b36cc0ae788a" containerName="keystone-bootstrap" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.016628 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cdc6ed-ce63-43af-8502-b36cc0ae788a" containerName="keystone-bootstrap" Jan 29 17:07:16 crc kubenswrapper[4886]: E0129 17:07:16.016642 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb212bbc-3071-4fda-968d-b6d3f19996ee" containerName="dnsmasq-dns" Jan 29 
17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.016648 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb212bbc-3071-4fda-968d-b6d3f19996ee" containerName="dnsmasq-dns" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.016832 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb212bbc-3071-4fda-968d-b6d3f19996ee" containerName="dnsmasq-dns" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.016847 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="68cdc6ed-ce63-43af-8502-b36cc0ae788a" containerName="keystone-bootstrap" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.018174 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.020930 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.021195 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5qcd" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.021196 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.021513 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.024718 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.027299 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.044205 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5499bdc9-q6hr4"] Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.091725 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-credential-keys\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.091853 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-scripts\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.091910 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf5dx\" (UniqueName: \"kubernetes.io/projected/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-kube-api-access-vf5dx\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.091963 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-combined-ca-bundle\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: 
I0129 17:07:16.092016 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-public-tls-certs\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.092030 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-config-data\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.092066 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-internal-tls-certs\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.092134 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-fernet-keys\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.194083 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-scripts\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.194176 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf5dx\" (UniqueName: \"kubernetes.io/projected/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-kube-api-access-vf5dx\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.194252 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-combined-ca-bundle\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.194346 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-public-tls-certs\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.194376 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-config-data\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.194424 4886 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-internal-tls-certs\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.194488 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-fernet-keys\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.194562 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-credential-keys\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.200154 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-credential-keys\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.200274 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-fernet-keys\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.201456 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-public-tls-certs\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.201585 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-combined-ca-bundle\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.202573 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-scripts\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.209401 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-internal-tls-certs\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.214121 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-config-data\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " 
pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.242670 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf5dx\" (UniqueName: \"kubernetes.io/projected/d9e327b0-6e20-4b1d-a18f-64b8b49ef36d-kube-api-access-vf5dx\") pod \"keystone-5499bdc9-q6hr4\" (UID: \"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d\") " pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.250246 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j5gfz" event={"ID":"04dae116-ceca-4588-9cba-1266bfa92caf","Type":"ContainerStarted","Data":"09a30c5dfcb3deacf09e3ccec1c515a8213db072a4cbe06ac44ba60b9a7d0159"} Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.252208 4886 generic.go:334] "Generic (PLEG): container finished" podID="8923ac96-087a-425b-a8b4-c09aa4be3d78" containerID="b56f617415d312996740dc4a8697ef643e749e77f4339179492aab6c12f2f0d4" exitCode=0 Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.252247 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8m2mm" event={"ID":"8923ac96-087a-425b-a8b4-c09aa4be3d78","Type":"ContainerDied","Data":"b56f617415d312996740dc4a8697ef643e749e77f4339179492aab6c12f2f0d4"} Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.274727 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-j5gfz" podStartSLOduration=3.998321662 podStartE2EDuration="1m12.27470308s" podCreationTimestamp="2026-01-29 17:06:04 +0000 UTC" firstStartedPulling="2026-01-29 17:06:06.313123617 +0000 UTC m=+2649.221842879" lastFinishedPulling="2026-01-29 17:07:14.589505035 +0000 UTC m=+2717.498224297" observedRunningTime="2026-01-29 17:07:16.271421087 +0000 UTC m=+2719.180140359" watchObservedRunningTime="2026-01-29 17:07:16.27470308 +0000 UTC m=+2719.183422352" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.336392 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.863146 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5499bdc9-q6hr4"] Jan 29 17:07:16 crc kubenswrapper[4886]: W0129 17:07:16.902041 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9e327b0_6e20_4b1d_a18f_64b8b49ef36d.slice/crio-9228f11c2df4be09dbc3fbbbdbf63e80d8c682804d34491222b93f145af49788 WatchSource:0}: Error finding container 9228f11c2df4be09dbc3fbbbdbf63e80d8c682804d34491222b93f145af49788: Status 404 returned error can't find the container with id 9228f11c2df4be09dbc3fbbbdbf63e80d8c682804d34491222b93f145af49788 Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.907343 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.918896 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 17:07:16 crc kubenswrapper[4886]: I0129 17:07:16.918942 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.267526 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5499bdc9-q6hr4" event={"ID":"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d","Type":"ContainerStarted","Data":"8fb0484c6a214f05410ef82efa17abe7d106d7d860627a7ea48d168639c2ad83"} Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.268853 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.268941 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5499bdc9-q6hr4" event={"ID":"d9e327b0-6e20-4b1d-a18f-64b8b49ef36d","Type":"ContainerStarted","Data":"9228f11c2df4be09dbc3fbbbdbf63e80d8c682804d34491222b93f145af49788"} Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.286937 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5499bdc9-q6hr4" podStartSLOduration=2.286915719 podStartE2EDuration="2.286915719s" podCreationTimestamp="2026-01-29 17:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:17.284819 +0000 UTC m=+2720.193538282" watchObservedRunningTime="2026-01-29 17:07:17.286915719 +0000 UTC m=+2720.195634991" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.668488 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8m2mm" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.851052 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ckms\" (UniqueName: \"kubernetes.io/projected/8923ac96-087a-425b-a8b4-c09aa4be3d78-kube-api-access-8ckms\") pod \"8923ac96-087a-425b-a8b4-c09aa4be3d78\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.851481 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-config-data\") pod \"8923ac96-087a-425b-a8b4-c09aa4be3d78\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.851699 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-combined-ca-bundle\") pod \"8923ac96-087a-425b-a8b4-c09aa4be3d78\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.851887 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-scripts\") pod \"8923ac96-087a-425b-a8b4-c09aa4be3d78\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.852054 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8923ac96-087a-425b-a8b4-c09aa4be3d78-logs\") pod \"8923ac96-087a-425b-a8b4-c09aa4be3d78\" (UID: \"8923ac96-087a-425b-a8b4-c09aa4be3d78\") " Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.852997 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8923ac96-087a-425b-a8b4-c09aa4be3d78-logs" (OuterVolumeSpecName: "logs") pod "8923ac96-087a-425b-a8b4-c09aa4be3d78" (UID: "8923ac96-087a-425b-a8b4-c09aa4be3d78"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.859242 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-scripts" (OuterVolumeSpecName: "scripts") pod "8923ac96-087a-425b-a8b4-c09aa4be3d78" (UID: "8923ac96-087a-425b-a8b4-c09aa4be3d78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.860530 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8923ac96-087a-425b-a8b4-c09aa4be3d78-kube-api-access-8ckms" (OuterVolumeSpecName: "kube-api-access-8ckms") pod "8923ac96-087a-425b-a8b4-c09aa4be3d78" (UID: "8923ac96-087a-425b-a8b4-c09aa4be3d78"). InnerVolumeSpecName "kube-api-access-8ckms". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.880634 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8923ac96-087a-425b-a8b4-c09aa4be3d78" (UID: "8923ac96-087a-425b-a8b4-c09aa4be3d78"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.904381 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-config-data" (OuterVolumeSpecName: "config-data") pod "8923ac96-087a-425b-a8b4-c09aa4be3d78" (UID: "8923ac96-087a-425b-a8b4-c09aa4be3d78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.955027 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.955071 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.955084 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8923ac96-087a-425b-a8b4-c09aa4be3d78-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.955096 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ckms\" (UniqueName: \"kubernetes.io/projected/8923ac96-087a-425b-a8b4-c09aa4be3d78-kube-api-access-8ckms\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:17 crc kubenswrapper[4886]: I0129 17:07:17.955110 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8923ac96-087a-425b-a8b4-c09aa4be3d78-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.281554 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8m2mm" event={"ID":"8923ac96-087a-425b-a8b4-c09aa4be3d78","Type":"ContainerDied","Data":"7ba3dd51612ec84b7435debfb27c88330b100c1320a10e3e0bea0e482e076cd8"} Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.281603 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8m2mm" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.281739 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ba3dd51612ec84b7435debfb27c88330b100c1320a10e3e0bea0e482e076cd8" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.431194 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-795d8c76d8-x2zqv"] Jan 29 17:07:18 crc kubenswrapper[4886]: E0129 17:07:18.431633 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8923ac96-087a-425b-a8b4-c09aa4be3d78" containerName="placement-db-sync" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.431650 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8923ac96-087a-425b-a8b4-c09aa4be3d78" containerName="placement-db-sync" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.431875 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8923ac96-087a-425b-a8b4-c09aa4be3d78" containerName="placement-db-sync" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.436879 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.446312 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.446596 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mrvvt" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.446707 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.446867 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.447437 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.458364 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-795d8c76d8-x2zqv"] Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.574749 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-combined-ca-bundle\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.574803 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-public-tls-certs\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.574856 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-config-data\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.574915 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e13d48e-3469-4f76-8bae-ab1a21556f5a-logs\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.574957 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-internal-tls-certs\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.574998 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv2wf\" (UniqueName: \"kubernetes.io/projected/7e13d48e-3469-4f76-8bae-ab1a21556f5a-kube-api-access-jv2wf\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 
17:07:18.575093 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-scripts\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.676493 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-combined-ca-bundle\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.676544 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-public-tls-certs\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.676587 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-config-data\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.676623 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e13d48e-3469-4f76-8bae-ab1a21556f5a-logs\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.676652 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-internal-tls-certs\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.676683 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv2wf\" (UniqueName: \"kubernetes.io/projected/7e13d48e-3469-4f76-8bae-ab1a21556f5a-kube-api-access-jv2wf\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.676751 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-scripts\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.677539 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e13d48e-3469-4f76-8bae-ab1a21556f5a-logs\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.689926 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-internal-tls-certs\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.691633 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-config-data\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.692413 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-public-tls-certs\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.693157 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-combined-ca-bundle\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.700705 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e13d48e-3469-4f76-8bae-ab1a21556f5a-scripts\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.708101 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv2wf\" (UniqueName: \"kubernetes.io/projected/7e13d48e-3469-4f76-8bae-ab1a21556f5a-kube-api-access-jv2wf\") pod \"placement-795d8c76d8-x2zqv\" (UID: \"7e13d48e-3469-4f76-8bae-ab1a21556f5a\") " pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:18 crc kubenswrapper[4886]: I0129 17:07:18.799596 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:19 crc kubenswrapper[4886]: I0129 17:07:19.369399 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-795d8c76d8-x2zqv"] Jan 29 17:07:19 crc kubenswrapper[4886]: W0129 17:07:19.374518 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e13d48e_3469_4f76_8bae_ab1a21556f5a.slice/crio-02ef7ab85d551b8c7255372e2df6940c041d74302e7d6145a578475e935f0fc2 WatchSource:0}: Error finding container 02ef7ab85d551b8c7255372e2df6940c041d74302e7d6145a578475e935f0fc2: Status 404 returned error can't find the container with id 02ef7ab85d551b8c7255372e2df6940c041d74302e7d6145a578475e935f0fc2 Jan 29 17:07:19 crc kubenswrapper[4886]: I0129 17:07:19.668896 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 17:07:20 crc kubenswrapper[4886]: I0129 17:07:20.309949 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-795d8c76d8-x2zqv" event={"ID":"7e13d48e-3469-4f76-8bae-ab1a21556f5a","Type":"ContainerStarted","Data":"58e68e11ea532ee03604b1a3e5d94c5d1b6fff5c393f020ac0dcc0a7eb5b76a9"} Jan 29 17:07:20 crc kubenswrapper[4886]: I0129 17:07:20.310342 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-795d8c76d8-x2zqv" event={"ID":"7e13d48e-3469-4f76-8bae-ab1a21556f5a","Type":"ContainerStarted","Data":"65b3a2de2f2bfa8b452044b25f0f46c9c8d2cf5077cb7b5fb82f688d7f51c24d"} Jan 29 17:07:20 crc kubenswrapper[4886]: I0129 17:07:20.310363 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-795d8c76d8-x2zqv" event={"ID":"7e13d48e-3469-4f76-8bae-ab1a21556f5a","Type":"ContainerStarted","Data":"02ef7ab85d551b8c7255372e2df6940c041d74302e7d6145a578475e935f0fc2"} Jan 29 17:07:20 crc kubenswrapper[4886]: I0129 17:07:20.310406 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:20 crc kubenswrapper[4886]: I0129 17:07:20.310434 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:20 crc kubenswrapper[4886]: I0129 17:07:20.338015 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-795d8c76d8-x2zqv" podStartSLOduration=2.337994294 podStartE2EDuration="2.337994294s" podCreationTimestamp="2026-01-29 17:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:20.332447608 +0000 UTC m=+2723.241166880" watchObservedRunningTime="2026-01-29 17:07:20.337994294 +0000 UTC m=+2723.246713556" Jan 29 17:07:29 crc kubenswrapper[4886]: I0129 17:07:29.418123 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87986c31-37d7-4624-87a2-b5678e01d865","Type":"ContainerStarted","Data":"6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55"} Jan 29 17:07:29 crc kubenswrapper[4886]: I0129 17:07:29.418729 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 17:07:29 crc kubenswrapper[4886]: I0129 17:07:29.418439 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="sg-core" 
containerID="cri-o://2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf" gracePeriod=30 Jan 29 17:07:29 crc kubenswrapper[4886]: I0129 17:07:29.418367 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="ceilometer-central-agent" containerID="cri-o://6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4" gracePeriod=30 Jan 29 17:07:29 crc kubenswrapper[4886]: I0129 17:07:29.418496 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="ceilometer-notification-agent" containerID="cri-o://fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927" gracePeriod=30 Jan 29 17:07:29 crc kubenswrapper[4886]: I0129 17:07:29.418523 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="proxy-httpd" containerID="cri-o://6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55" gracePeriod=30 Jan 29 17:07:29 crc kubenswrapper[4886]: I0129 17:07:29.454070 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.666899876 podStartE2EDuration="1m25.454051854s" podCreationTimestamp="2026-01-29 17:06:04 +0000 UTC" firstStartedPulling="2026-01-29 17:06:07.088088704 +0000 UTC m=+2649.996807966" lastFinishedPulling="2026-01-29 17:07:28.875240682 +0000 UTC m=+2731.783959944" observedRunningTime="2026-01-29 17:07:29.451936645 +0000 UTC m=+2732.360655917" watchObservedRunningTime="2026-01-29 17:07:29.454051854 +0000 UTC m=+2732.362771146" Jan 29 17:07:30 crc kubenswrapper[4886]: I0129 17:07:30.432735 4886 generic.go:334] "Generic (PLEG): container finished" podID="87986c31-37d7-4624-87a2-b5678e01d865" containerID="6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55" exitCode=0 Jan 29 17:07:30 crc kubenswrapper[4886]: I0129 17:07:30.433130 4886 generic.go:334] "Generic (PLEG): container finished" podID="87986c31-37d7-4624-87a2-b5678e01d865" containerID="2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf" exitCode=2 Jan 29 17:07:30 crc kubenswrapper[4886]: I0129 17:07:30.433152 4886 generic.go:334] "Generic (PLEG): container finished" podID="87986c31-37d7-4624-87a2-b5678e01d865" containerID="6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4" exitCode=0 Jan 29 17:07:30 crc kubenswrapper[4886]: I0129 17:07:30.432825 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87986c31-37d7-4624-87a2-b5678e01d865","Type":"ContainerDied","Data":"6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55"} Jan 29 17:07:30 crc kubenswrapper[4886]: I0129 17:07:30.433200 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87986c31-37d7-4624-87a2-b5678e01d865","Type":"ContainerDied","Data":"2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf"} Jan 29 17:07:30 crc kubenswrapper[4886]: I0129 17:07:30.433229 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87986c31-37d7-4624-87a2-b5678e01d865","Type":"ContainerDied","Data":"6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4"} Jan 29 17:07:32 crc kubenswrapper[4886]: I0129 17:07:32.457532 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="ffb099fb-7bdb-4969-b3cb-6fc4ef498afd" containerID="462d0b69d42ff5bdae3194985f827b482bb0c2607dbc772e35d27e51d1171c94" exitCode=0 Jan 29 17:07:32 crc kubenswrapper[4886]: I0129 17:07:32.457622 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q2dxw" event={"ID":"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd","Type":"ContainerDied","Data":"462d0b69d42ff5bdae3194985f827b482bb0c2607dbc772e35d27e51d1171c94"} Jan 29 17:07:33 crc kubenswrapper[4886]: I0129 17:07:33.941372 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.037598 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86p7n\" (UniqueName: \"kubernetes.io/projected/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-kube-api-access-86p7n\") pod \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.037855 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-db-sync-config-data\") pod \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.037932 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-combined-ca-bundle\") pod \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\" (UID: \"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd\") " Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.054921 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ffb099fb-7bdb-4969-b3cb-6fc4ef498afd" (UID: "ffb099fb-7bdb-4969-b3cb-6fc4ef498afd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.055001 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-kube-api-access-86p7n" (OuterVolumeSpecName: "kube-api-access-86p7n") pod "ffb099fb-7bdb-4969-b3cb-6fc4ef498afd" (UID: "ffb099fb-7bdb-4969-b3cb-6fc4ef498afd"). InnerVolumeSpecName "kube-api-access-86p7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.070861 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffb099fb-7bdb-4969-b3cb-6fc4ef498afd" (UID: "ffb099fb-7bdb-4969-b3cb-6fc4ef498afd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.140034 4886 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.140073 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.140088 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86p7n\" (UniqueName: \"kubernetes.io/projected/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd-kube-api-access-86p7n\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.486983 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q2dxw" event={"ID":"ffb099fb-7bdb-4969-b3cb-6fc4ef498afd","Type":"ContainerDied","Data":"474a2d0d1c07609e70e6ff2d358c4e7ec5598344e910e4e2e3ec3d713255b48d"} Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.487022 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="474a2d0d1c07609e70e6ff2d358c4e7ec5598344e910e4e2e3ec3d713255b48d" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.487638 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q2dxw" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.877726 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-f4657cb95-4tfvc"] Jan 29 17:07:34 crc kubenswrapper[4886]: E0129 17:07:34.878499 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffb099fb-7bdb-4969-b3cb-6fc4ef498afd" containerName="barbican-db-sync" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.878520 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb099fb-7bdb-4969-b3cb-6fc4ef498afd" containerName="barbican-db-sync" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.878735 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffb099fb-7bdb-4969-b3cb-6fc4ef498afd" containerName="barbican-db-sync" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.879840 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.883555 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.883725 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.883850 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5k8bj" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.892637 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-85cc5d579d-jhqqd"] Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.894349 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.898661 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.921761 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f4657cb95-4tfvc"] Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.957815 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f83894a-73ec-405a-bdd2-2044b3f9140a-config-data-custom\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.957867 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054e527c-8ce1-4d03-8fef-0430934daba3-config-data\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.957951 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054e527c-8ce1-4d03-8fef-0430934daba3-combined-ca-bundle\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.958068 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/054e527c-8ce1-4d03-8fef-0430934daba3-config-data-custom\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.958115 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f83894a-73ec-405a-bdd2-2044b3f9140a-config-data\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.958203 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/054e527c-8ce1-4d03-8fef-0430934daba3-logs\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.958298 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcrxr\" (UniqueName: \"kubernetes.io/projected/8f83894a-73ec-405a-bdd2-2044b3f9140a-kube-api-access-rcrxr\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.958384 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f83894a-73ec-405a-bdd2-2044b3f9140a-combined-ca-bundle\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.958566 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsqnr\" (UniqueName: \"kubernetes.io/projected/054e527c-8ce1-4d03-8fef-0430934daba3-kube-api-access-xsqnr\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.958592 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f83894a-73ec-405a-bdd2-2044b3f9140a-logs\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.966405 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-85cc5d579d-jhqqd"] Jan 29 17:07:34 crc kubenswrapper[4886]: I0129 17:07:34.999154 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-jsg5q"] Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.001298 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.004533 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-jsg5q"] Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.063774 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/054e527c-8ce1-4d03-8fef-0430934daba3-logs\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.064012 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcrxr\" (UniqueName: \"kubernetes.io/projected/8f83894a-73ec-405a-bdd2-2044b3f9140a-kube-api-access-rcrxr\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.064092 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f83894a-73ec-405a-bdd2-2044b3f9140a-combined-ca-bundle\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.064216 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsqnr\" (UniqueName: \"kubernetes.io/projected/054e527c-8ce1-4d03-8fef-0430934daba3-kube-api-access-xsqnr\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 
17:07:35.064287 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f83894a-73ec-405a-bdd2-2044b3f9140a-logs\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.064404 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.064500 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f83894a-73ec-405a-bdd2-2044b3f9140a-config-data-custom\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.064577 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054e527c-8ce1-4d03-8fef-0430934daba3-config-data\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.064668 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-config\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.067435 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.067669 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054e527c-8ce1-4d03-8fef-0430934daba3-combined-ca-bundle\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.067773 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.067964 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/054e527c-8ce1-4d03-8fef-0430934daba3-config-data-custom\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " 
pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.068086 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f83894a-73ec-405a-bdd2-2044b3f9140a-config-data\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.068172 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzzbl\" (UniqueName: \"kubernetes.io/projected/9ac97bdb-475a-4061-96b0-1423be10bb5b-kube-api-access-tzzbl\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.068247 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.076762 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054e527c-8ce1-4d03-8fef-0430934daba3-config-data\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.064337 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/054e527c-8ce1-4d03-8fef-0430934daba3-logs\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.065311 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f83894a-73ec-405a-bdd2-2044b3f9140a-logs\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.079584 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054e527c-8ce1-4d03-8fef-0430934daba3-combined-ca-bundle\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.087008 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f83894a-73ec-405a-bdd2-2044b3f9140a-config-data-custom\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.092687 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f83894a-73ec-405a-bdd2-2044b3f9140a-combined-ca-bundle\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: 
\"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.094184 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f83894a-73ec-405a-bdd2-2044b3f9140a-config-data\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.095718 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcrxr\" (UniqueName: \"kubernetes.io/projected/8f83894a-73ec-405a-bdd2-2044b3f9140a-kube-api-access-rcrxr\") pod \"barbican-worker-f4657cb95-4tfvc\" (UID: \"8f83894a-73ec-405a-bdd2-2044b3f9140a\") " pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.098566 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsqnr\" (UniqueName: \"kubernetes.io/projected/054e527c-8ce1-4d03-8fef-0430934daba3-kube-api-access-xsqnr\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.114785 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/054e527c-8ce1-4d03-8fef-0430934daba3-config-data-custom\") pod \"barbican-keystone-listener-85cc5d579d-jhqqd\" (UID: \"054e527c-8ce1-4d03-8fef-0430934daba3\") " pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.119829 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55f7ff7dd6-jj4jw"] Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.121476 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.127781 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.141611 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55f7ff7dd6-jj4jw"] Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.171133 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-combined-ca-bundle\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.171195 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzzbl\" (UniqueName: \"kubernetes.io/projected/9ac97bdb-475a-4061-96b0-1423be10bb5b-kube-api-access-tzzbl\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.171226 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.171443 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.171484 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.171557 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-config\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.171596 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.171641 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 
crc kubenswrapper[4886]: I0129 17:07:35.171675 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25kn5\" (UniqueName: \"kubernetes.io/projected/ea36feff-2438-49e4-b779-0b083addd0a8-kube-api-access-25kn5\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.171695 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data-custom\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.171730 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea36feff-2438-49e4-b779-0b083addd0a8-logs\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.172441 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.172478 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.172478 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-config\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.172747 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.173145 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.188615 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzzbl\" (UniqueName: \"kubernetes.io/projected/9ac97bdb-475a-4061-96b0-1423be10bb5b-kube-api-access-tzzbl\") pod \"dnsmasq-dns-586bdc5f9-jsg5q\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.219749 4886 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-f4657cb95-4tfvc" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.228613 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.273574 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.273725 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25kn5\" (UniqueName: \"kubernetes.io/projected/ea36feff-2438-49e4-b779-0b083addd0a8-kube-api-access-25kn5\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.273773 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data-custom\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.273827 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea36feff-2438-49e4-b779-0b083addd0a8-logs\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.273875 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-combined-ca-bundle\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.277930 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-combined-ca-bundle\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.281176 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.283928 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data-custom\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.285020 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea36feff-2438-49e4-b779-0b083addd0a8-logs\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.313550 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25kn5\" (UniqueName: \"kubernetes.io/projected/ea36feff-2438-49e4-b779-0b083addd0a8-kube-api-access-25kn5\") pod \"barbican-api-55f7ff7dd6-jj4jw\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.333514 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.383498 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:35 crc kubenswrapper[4886]: I0129 17:07:35.863769 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f4657cb95-4tfvc"] Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.026830 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-jsg5q"] Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.039573 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-85cc5d579d-jhqqd"] Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.243387 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55f7ff7dd6-jj4jw"] Jan 29 17:07:36 crc kubenswrapper[4886]: W0129 17:07:36.302360 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea36feff_2438_49e4_b779_0b083addd0a8.slice/crio-e9dafe9a7a14455f6d6567489f608749fce9a0af4812468a1f99388ab4f30929 WatchSource:0}: Error finding container e9dafe9a7a14455f6d6567489f608749fce9a0af4812468a1f99388ab4f30929: Status 404 returned error can't find the container with id e9dafe9a7a14455f6d6567489f608749fce9a0af4812468a1f99388ab4f30929 Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.533028 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.595963 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f4657cb95-4tfvc" event={"ID":"8f83894a-73ec-405a-bdd2-2044b3f9140a","Type":"ContainerStarted","Data":"c6e79ae953c0f36ce267680773fd96b75453c6a1745545d5e448c84519c6cdae"} Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.604025 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" event={"ID":"054e527c-8ce1-4d03-8fef-0430934daba3","Type":"ContainerStarted","Data":"e62654d928fcfc926c64b76e5a652ed2c3fb2b029b9bd38eabebb8f8d2e377c1"} Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.606022 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" event={"ID":"9ac97bdb-475a-4061-96b0-1423be10bb5b","Type":"ContainerStarted","Data":"1724f7bc6805ebdf2ea8515900b97a42430de51ca57fd28deec62f818f0909c2"} Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.607754 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" event={"ID":"ea36feff-2438-49e4-b779-0b083addd0a8","Type":"ContainerStarted","Data":"e9dafe9a7a14455f6d6567489f608749fce9a0af4812468a1f99388ab4f30929"} Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.609943 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-config-data\") pod \"87986c31-37d7-4624-87a2-b5678e01d865\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.610085 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-scripts\") pod \"87986c31-37d7-4624-87a2-b5678e01d865\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.610143 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-sg-core-conf-yaml\") pod \"87986c31-37d7-4624-87a2-b5678e01d865\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.610203 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-run-httpd\") pod \"87986c31-37d7-4624-87a2-b5678e01d865\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.610284 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-log-httpd\") pod \"87986c31-37d7-4624-87a2-b5678e01d865\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.610399 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4459b\" (UniqueName: \"kubernetes.io/projected/87986c31-37d7-4624-87a2-b5678e01d865-kube-api-access-4459b\") pod \"87986c31-37d7-4624-87a2-b5678e01d865\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.610421 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-combined-ca-bundle\") pod \"87986c31-37d7-4624-87a2-b5678e01d865\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.613271 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "87986c31-37d7-4624-87a2-b5678e01d865" (UID: "87986c31-37d7-4624-87a2-b5678e01d865"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.613503 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "87986c31-37d7-4624-87a2-b5678e01d865" (UID: "87986c31-37d7-4624-87a2-b5678e01d865"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.614039 4886 generic.go:334] "Generic (PLEG): container finished" podID="87986c31-37d7-4624-87a2-b5678e01d865" containerID="fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927" exitCode=0 Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.614949 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.632534 4886 generic.go:334] "Generic (PLEG): container finished" podID="a0058f32-ae80-4dde-9dce-095c62f45979" containerID="ab83d2d0c36aaea48832e86668e20e1d6f6f876644014c27f52bee83b6960b7d" exitCode=0 Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.644082 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-scripts" (OuterVolumeSpecName: "scripts") pod "87986c31-37d7-4624-87a2-b5678e01d865" (UID: "87986c31-37d7-4624-87a2-b5678e01d865"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.644146 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87986c31-37d7-4624-87a2-b5678e01d865-kube-api-access-4459b" (OuterVolumeSpecName: "kube-api-access-4459b") pod "87986c31-37d7-4624-87a2-b5678e01d865" (UID: "87986c31-37d7-4624-87a2-b5678e01d865"). InnerVolumeSpecName "kube-api-access-4459b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.668617 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87986c31-37d7-4624-87a2-b5678e01d865","Type":"ContainerDied","Data":"fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927"} Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.668739 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87986c31-37d7-4624-87a2-b5678e01d865","Type":"ContainerDied","Data":"3e6ce925c7e7561fcefff1c9869e186415899419d2d1d24db82a0097aea34d23"} Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.668755 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6nmwn" event={"ID":"a0058f32-ae80-4dde-9dce-095c62f45979","Type":"ContainerDied","Data":"ab83d2d0c36aaea48832e86668e20e1d6f6f876644014c27f52bee83b6960b7d"} Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.668779 4886 scope.go:117] "RemoveContainer" containerID="6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.689846 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "87986c31-37d7-4624-87a2-b5678e01d865" (UID: "87986c31-37d7-4624-87a2-b5678e01d865"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.712694 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.712718 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87986c31-37d7-4624-87a2-b5678e01d865-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.712728 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4459b\" (UniqueName: \"kubernetes.io/projected/87986c31-37d7-4624-87a2-b5678e01d865-kube-api-access-4459b\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.712737 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.712745 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.751163 4886 scope.go:117] "RemoveContainer" containerID="2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf" Jan 29 17:07:36 crc kubenswrapper[4886]: E0129 17:07:36.764812 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-combined-ca-bundle podName:87986c31-37d7-4624-87a2-b5678e01d865 nodeName:}" failed. No retries permitted until 2026-01-29 17:07:37.264786205 +0000 UTC m=+2740.173505477 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-combined-ca-bundle") pod "87986c31-37d7-4624-87a2-b5678e01d865" (UID: "87986c31-37d7-4624-87a2-b5678e01d865") : error deleting /var/lib/kubelet/pods/87986c31-37d7-4624-87a2-b5678e01d865/volume-subpaths: remove /var/lib/kubelet/pods/87986c31-37d7-4624-87a2-b5678e01d865/volume-subpaths: no such file or directory Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.767794 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-config-data" (OuterVolumeSpecName: "config-data") pod "87986c31-37d7-4624-87a2-b5678e01d865" (UID: "87986c31-37d7-4624-87a2-b5678e01d865"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.773559 4886 scope.go:117] "RemoveContainer" containerID="fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.801877 4886 scope.go:117] "RemoveContainer" containerID="6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.815737 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.828260 4886 scope.go:117] "RemoveContainer" containerID="6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55" Jan 29 17:07:36 crc kubenswrapper[4886]: E0129 17:07:36.829526 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55\": container with ID starting with 6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55 not found: ID does not exist" containerID="6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.829590 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55"} err="failed to get container status \"6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55\": rpc error: code = NotFound desc = could not find container \"6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55\": container with ID starting with 6996141f6a6ddf86f1830cd32cfa7315a6d22f9c619ba74af481f02099316d55 not found: ID does not exist" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.829655 4886 scope.go:117] "RemoveContainer" containerID="2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf" Jan 29 17:07:36 crc kubenswrapper[4886]: E0129 17:07:36.830044 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf\": container with ID starting with 2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf not found: ID does not exist" containerID="2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.830071 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf"} err="failed to get container status \"2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf\": rpc error: code = NotFound desc = could not find container \"2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf\": container with ID starting with 2af8246b154ee39fedcfdd8e1579a14d1154c4bc23cb6682bb1d0354640c6bcf not found: ID does not exist" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.830093 4886 scope.go:117] "RemoveContainer" containerID="fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927" Jan 29 17:07:36 crc kubenswrapper[4886]: E0129 17:07:36.833312 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927\": container with ID starting with fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927 not found: ID does not exist" containerID="fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.833385 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927"} err="failed to get container status \"fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927\": rpc error: code = NotFound desc = could not find container \"fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927\": container with ID starting with fc4b86cf717b23c7c04aaa4106c7da0d6d9a36f8580e8da13099630ec38cb927 not found: ID does not exist" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.833416 4886 scope.go:117] "RemoveContainer" containerID="6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4" Jan 29 17:07:36 crc kubenswrapper[4886]: E0129 17:07:36.835885 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4\": container with ID starting with 6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4 not found: ID does not exist" containerID="6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4" Jan 29 17:07:36 crc kubenswrapper[4886]: I0129 17:07:36.835928 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4"} err="failed to get container status \"6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4\": rpc error: code = NotFound desc = could not find container \"6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4\": container with ID starting with 6528db29d7d5821f74fc120a90a127f94065eb87d3cb30310e3e2849cde918e4 not found: ID does not exist" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.324927 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-combined-ca-bundle\") pod \"87986c31-37d7-4624-87a2-b5678e01d865\" (UID: \"87986c31-37d7-4624-87a2-b5678e01d865\") " Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.344534 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "87986c31-37d7-4624-87a2-b5678e01d865" (UID: "87986c31-37d7-4624-87a2-b5678e01d865"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.427862 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87986c31-37d7-4624-87a2-b5678e01d865-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.557773 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.582930 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.594399 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:07:37 crc kubenswrapper[4886]: E0129 17:07:37.594845 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="proxy-httpd" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.594862 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="proxy-httpd" Jan 29 17:07:37 crc kubenswrapper[4886]: E0129 17:07:37.594891 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="sg-core" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.594898 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="sg-core" Jan 29 17:07:37 crc kubenswrapper[4886]: E0129 17:07:37.594910 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="ceilometer-notification-agent" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.594920 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="ceilometer-notification-agent" Jan 29 17:07:37 crc kubenswrapper[4886]: E0129 17:07:37.594936 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="ceilometer-central-agent" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.594942 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="ceilometer-central-agent" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.595187 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="proxy-httpd" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.595207 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="sg-core" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.595222 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="ceilometer-notification-agent" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.595235 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="87986c31-37d7-4624-87a2-b5678e01d865" containerName="ceilometer-central-agent" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.597790 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.604510 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.604531 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.615491 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.648316 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-log-httpd\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.648417 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-run-httpd\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.648462 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-scripts\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.648490 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.648511 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-config-data\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.648531 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.648560 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kkf6\" (UniqueName: \"kubernetes.io/projected/24e9fd03-4a7f-45c7-83e6-608ad7648766-kube-api-access-5kkf6\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.673045 4886 generic.go:334] "Generic (PLEG): container finished" podID="9ac97bdb-475a-4061-96b0-1423be10bb5b" containerID="d6011c232b01e3892826684cea65e05a2b5a15c43a2d859d545b9c20ac294a14" exitCode=0 Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.673098 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" event={"ID":"9ac97bdb-475a-4061-96b0-1423be10bb5b","Type":"ContainerDied","Data":"d6011c232b01e3892826684cea65e05a2b5a15c43a2d859d545b9c20ac294a14"} Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.674507 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" event={"ID":"ea36feff-2438-49e4-b779-0b083addd0a8","Type":"ContainerStarted","Data":"8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2"} Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.674545 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" event={"ID":"ea36feff-2438-49e4-b779-0b083addd0a8","Type":"ContainerStarted","Data":"f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5"} Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.674601 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.674626 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.721783 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" podStartSLOduration=2.721760349 podStartE2EDuration="2.721760349s" podCreationTimestamp="2026-01-29 17:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:37.714531746 +0000 UTC m=+2740.623251038" watchObservedRunningTime="2026-01-29 17:07:37.721760349 +0000 UTC m=+2740.630479621" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.750481 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-scripts\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.750579 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.751542 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-config-data\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.751574 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.751614 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kkf6\" (UniqueName: \"kubernetes.io/projected/24e9fd03-4a7f-45c7-83e6-608ad7648766-kube-api-access-5kkf6\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc 
kubenswrapper[4886]: I0129 17:07:37.751774 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-log-httpd\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.751857 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-run-httpd\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.752280 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-run-httpd\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.753519 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-log-httpd\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.757095 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-config-data\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.760505 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.768570 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.769058 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kkf6\" (UniqueName: \"kubernetes.io/projected/24e9fd03-4a7f-45c7-83e6-608ad7648766-kube-api-access-5kkf6\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.784212 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-scripts\") pod \"ceilometer-0\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " pod="openstack/ceilometer-0" Jan 29 17:07:37 crc kubenswrapper[4886]: I0129 17:07:37.951125 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.036681 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5fb894ff6d-w7s26"] Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.038790 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.050926 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.051155 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.054851 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fb894ff6d-w7s26"] Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.161160 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-internal-tls-certs\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.161227 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b87936a5-19e1-4a58-948f-1f569c08bb6b-logs\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.161432 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-public-tls-certs\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.161545 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw87g\" (UniqueName: \"kubernetes.io/projected/b87936a5-19e1-4a58-948f-1f569c08bb6b-kube-api-access-fw87g\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.161567 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-config-data-custom\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.161750 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-combined-ca-bundle\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.161880 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-config-data\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.217964 4886 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/heat-db-sync-6nmwn" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.263528 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-combined-ca-bundle\") pod \"a0058f32-ae80-4dde-9dce-095c62f45979\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.263582 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-config-data\") pod \"a0058f32-ae80-4dde-9dce-095c62f45979\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.263645 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v7hl\" (UniqueName: \"kubernetes.io/projected/a0058f32-ae80-4dde-9dce-095c62f45979-kube-api-access-9v7hl\") pod \"a0058f32-ae80-4dde-9dce-095c62f45979\" (UID: \"a0058f32-ae80-4dde-9dce-095c62f45979\") " Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.264044 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-config-data\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.264105 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-internal-tls-certs\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.264141 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b87936a5-19e1-4a58-948f-1f569c08bb6b-logs\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.264189 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-public-tls-certs\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.264239 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw87g\" (UniqueName: \"kubernetes.io/projected/b87936a5-19e1-4a58-948f-1f569c08bb6b-kube-api-access-fw87g\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.264255 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-config-data-custom\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.264377 
4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-combined-ca-bundle\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.265041 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b87936a5-19e1-4a58-948f-1f569c08bb6b-logs\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.269517 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-internal-tls-certs\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.270887 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-combined-ca-bundle\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.271435 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-config-data-custom\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.272724 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-config-data\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.274815 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87936a5-19e1-4a58-948f-1f569c08bb6b-public-tls-certs\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.275556 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0058f32-ae80-4dde-9dce-095c62f45979-kube-api-access-9v7hl" (OuterVolumeSpecName: "kube-api-access-9v7hl") pod "a0058f32-ae80-4dde-9dce-095c62f45979" (UID: "a0058f32-ae80-4dde-9dce-095c62f45979"). InnerVolumeSpecName "kube-api-access-9v7hl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.288105 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw87g\" (UniqueName: \"kubernetes.io/projected/b87936a5-19e1-4a58-948f-1f569c08bb6b-kube-api-access-fw87g\") pod \"barbican-api-5fb894ff6d-w7s26\" (UID: \"b87936a5-19e1-4a58-948f-1f569c08bb6b\") " pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.306164 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0058f32-ae80-4dde-9dce-095c62f45979" (UID: "a0058f32-ae80-4dde-9dce-095c62f45979"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.367990 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.368021 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v7hl\" (UniqueName: \"kubernetes.io/projected/a0058f32-ae80-4dde-9dce-095c62f45979-kube-api-access-9v7hl\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.375239 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-config-data" (OuterVolumeSpecName: "config-data") pod "a0058f32-ae80-4dde-9dce-095c62f45979" (UID: "a0058f32-ae80-4dde-9dce-095c62f45979"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.470845 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0058f32-ae80-4dde-9dce-095c62f45979-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.504568 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:38 crc kubenswrapper[4886]: W0129 17:07:38.550503 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24e9fd03_4a7f_45c7_83e6_608ad7648766.slice/crio-92751cfdf549c65a3a37a865694b9ce91879a5f41c663c775080337b3acc7481 WatchSource:0}: Error finding container 92751cfdf549c65a3a37a865694b9ce91879a5f41c663c775080337b3acc7481: Status 404 returned error can't find the container with id 92751cfdf549c65a3a37a865694b9ce91879a5f41c663c775080337b3acc7481 Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.555076 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.644019 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87986c31-37d7-4624-87a2-b5678e01d865" path="/var/lib/kubelet/pods/87986c31-37d7-4624-87a2-b5678e01d865/volumes" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.694044 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6nmwn" event={"ID":"a0058f32-ae80-4dde-9dce-095c62f45979","Type":"ContainerDied","Data":"d9df74376035a2b4e196d856e8d76469a75a91514ac671f314bd4926926ee2e3"} Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.694091 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9df74376035a2b4e196d856e8d76469a75a91514ac671f314bd4926926ee2e3" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.694058 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-6nmwn" Jan 29 17:07:38 crc kubenswrapper[4886]: I0129 17:07:38.696802 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e9fd03-4a7f-45c7-83e6-608ad7648766","Type":"ContainerStarted","Data":"92751cfdf549c65a3a37a865694b9ce91879a5f41c663c775080337b3acc7481"} Jan 29 17:07:39 crc kubenswrapper[4886]: I0129 17:07:39.030931 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fb894ff6d-w7s26"] Jan 29 17:07:39 crc kubenswrapper[4886]: I0129 17:07:39.708446 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" event={"ID":"9ac97bdb-475a-4061-96b0-1423be10bb5b","Type":"ContainerStarted","Data":"a528683376327e5804a4ea1ec553e70518415fe775e3feb358ab1099f935a1fb"} Jan 29 17:07:39 crc kubenswrapper[4886]: I0129 17:07:39.708796 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:39 crc kubenswrapper[4886]: I0129 17:07:39.734773 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" podStartSLOduration=5.734753896 podStartE2EDuration="5.734753896s" podCreationTimestamp="2026-01-29 17:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:39.725469865 +0000 UTC m=+2742.634189137" watchObservedRunningTime="2026-01-29 17:07:39.734753896 +0000 UTC m=+2742.643473168" Jan 29 17:07:39 crc kubenswrapper[4886]: W0129 17:07:39.848455 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb87936a5_19e1_4a58_948f_1f569c08bb6b.slice/crio-5670a05f4acee95bf1f3b5e9db23d52bf751ed4a054baf63e3e9aace49a37d13 
WatchSource:0}: Error finding container 5670a05f4acee95bf1f3b5e9db23d52bf751ed4a054baf63e3e9aace49a37d13: Status 404 returned error can't find the container with id 5670a05f4acee95bf1f3b5e9db23d52bf751ed4a054baf63e3e9aace49a37d13 Jan 29 17:07:40 crc kubenswrapper[4886]: I0129 17:07:40.719799 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fb894ff6d-w7s26" event={"ID":"b87936a5-19e1-4a58-948f-1f569c08bb6b","Type":"ContainerStarted","Data":"5670a05f4acee95bf1f3b5e9db23d52bf751ed4a054baf63e3e9aace49a37d13"} Jan 29 17:07:41 crc kubenswrapper[4886]: I0129 17:07:41.732494 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fb894ff6d-w7s26" event={"ID":"b87936a5-19e1-4a58-948f-1f569c08bb6b","Type":"ContainerStarted","Data":"75e585ba6e31872c391d4f021d333f4dc8414bf7f94dc2577e762cfee1d307f3"} Jan 29 17:07:41 crc kubenswrapper[4886]: I0129 17:07:41.734964 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e9fd03-4a7f-45c7-83e6-608ad7648766","Type":"ContainerStarted","Data":"472df94bcf2c9160f704fb8f0e7681c07c27ea44d994460b0bfef6434e9a5bfa"} Jan 29 17:07:41 crc kubenswrapper[4886]: I0129 17:07:41.736584 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f4657cb95-4tfvc" event={"ID":"8f83894a-73ec-405a-bdd2-2044b3f9140a","Type":"ContainerStarted","Data":"4f672b9ba40814a9dd3c3a838059715e007cf5e911ed8e940e56c86de2273636"} Jan 29 17:07:42 crc kubenswrapper[4886]: I0129 17:07:42.752470 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fb894ff6d-w7s26" event={"ID":"b87936a5-19e1-4a58-948f-1f569c08bb6b","Type":"ContainerStarted","Data":"361c448a89a664088cb620037ad2edbb0b1c2b53501090897700deca3cf05ec1"} Jan 29 17:07:42 crc kubenswrapper[4886]: I0129 17:07:42.753088 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:42 crc kubenswrapper[4886]: I0129 17:07:42.753127 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:42 crc kubenswrapper[4886]: I0129 17:07:42.776597 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e9fd03-4a7f-45c7-83e6-608ad7648766","Type":"ContainerStarted","Data":"1bdf46565ca1048aaf33d2e55676cc44132df701332d9cac871024cf7e0601b1"} Jan 29 17:07:42 crc kubenswrapper[4886]: I0129 17:07:42.792039 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5fb894ff6d-w7s26" podStartSLOduration=5.792021396 podStartE2EDuration="5.792021396s" podCreationTimestamp="2026-01-29 17:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:42.786679735 +0000 UTC m=+2745.695399007" watchObservedRunningTime="2026-01-29 17:07:42.792021396 +0000 UTC m=+2745.700740668" Jan 29 17:07:42 crc kubenswrapper[4886]: I0129 17:07:42.793392 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f4657cb95-4tfvc" event={"ID":"8f83894a-73ec-405a-bdd2-2044b3f9140a","Type":"ContainerStarted","Data":"a134eb869b542799f5b8ee4915f6e2f42dae3c4d8dc9c506e22973bc89774628"} Jan 29 17:07:42 crc kubenswrapper[4886]: I0129 17:07:42.800456 4886 generic.go:334] "Generic (PLEG): container finished" podID="04dae116-ceca-4588-9cba-1266bfa92caf" 
containerID="09a30c5dfcb3deacf09e3ccec1c515a8213db072a4cbe06ac44ba60b9a7d0159" exitCode=0 Jan 29 17:07:42 crc kubenswrapper[4886]: I0129 17:07:42.800531 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j5gfz" event={"ID":"04dae116-ceca-4588-9cba-1266bfa92caf","Type":"ContainerDied","Data":"09a30c5dfcb3deacf09e3ccec1c515a8213db072a4cbe06ac44ba60b9a7d0159"} Jan 29 17:07:42 crc kubenswrapper[4886]: I0129 17:07:42.815519 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-f4657cb95-4tfvc" podStartSLOduration=4.253467625 podStartE2EDuration="8.815499717s" podCreationTimestamp="2026-01-29 17:07:34 +0000 UTC" firstStartedPulling="2026-01-29 17:07:35.919972361 +0000 UTC m=+2738.828691623" lastFinishedPulling="2026-01-29 17:07:40.482004443 +0000 UTC m=+2743.390723715" observedRunningTime="2026-01-29 17:07:42.812077731 +0000 UTC m=+2745.720797003" watchObservedRunningTime="2026-01-29 17:07:42.815499717 +0000 UTC m=+2745.724218979" Jan 29 17:07:42 crc kubenswrapper[4886]: I0129 17:07:42.822990 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" event={"ID":"054e527c-8ce1-4d03-8fef-0430934daba3","Type":"ContainerStarted","Data":"a8e7e2bd6b3cda1bfc8f6441f00f6807a7324ed4d8e27f36ee1ce8a6f9f49cfe"} Jan 29 17:07:43 crc kubenswrapper[4886]: I0129 17:07:43.846771 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" event={"ID":"054e527c-8ce1-4d03-8fef-0430934daba3","Type":"ContainerStarted","Data":"39b79bf84cb88167d5c8bac93b91dc7b502f104f2e0d2c0fcc75c3fc93973f4e"} Jan 29 17:07:43 crc kubenswrapper[4886]: I0129 17:07:43.873216 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-85cc5d579d-jhqqd" podStartSLOduration=3.697965619 podStartE2EDuration="9.873195628s" podCreationTimestamp="2026-01-29 17:07:34 +0000 UTC" firstStartedPulling="2026-01-29 17:07:36.042686707 +0000 UTC m=+2738.951405979" lastFinishedPulling="2026-01-29 17:07:42.217916706 +0000 UTC m=+2745.126635988" observedRunningTime="2026-01-29 17:07:43.865215353 +0000 UTC m=+2746.773934645" watchObservedRunningTime="2026-01-29 17:07:43.873195628 +0000 UTC m=+2746.781914910" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.343283 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.429293 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-config-data\") pod \"04dae116-ceca-4588-9cba-1266bfa92caf\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.429461 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rkdq\" (UniqueName: \"kubernetes.io/projected/04dae116-ceca-4588-9cba-1266bfa92caf-kube-api-access-2rkdq\") pod \"04dae116-ceca-4588-9cba-1266bfa92caf\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.429528 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-scripts\") pod \"04dae116-ceca-4588-9cba-1266bfa92caf\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.429629 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-db-sync-config-data\") pod \"04dae116-ceca-4588-9cba-1266bfa92caf\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.429822 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-combined-ca-bundle\") pod \"04dae116-ceca-4588-9cba-1266bfa92caf\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.429920 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04dae116-ceca-4588-9cba-1266bfa92caf-etc-machine-id\") pod \"04dae116-ceca-4588-9cba-1266bfa92caf\" (UID: \"04dae116-ceca-4588-9cba-1266bfa92caf\") " Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.430642 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04dae116-ceca-4588-9cba-1266bfa92caf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "04dae116-ceca-4588-9cba-1266bfa92caf" (UID: "04dae116-ceca-4588-9cba-1266bfa92caf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.435215 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04dae116-ceca-4588-9cba-1266bfa92caf-kube-api-access-2rkdq" (OuterVolumeSpecName: "kube-api-access-2rkdq") pod "04dae116-ceca-4588-9cba-1266bfa92caf" (UID: "04dae116-ceca-4588-9cba-1266bfa92caf"). InnerVolumeSpecName "kube-api-access-2rkdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.435593 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-scripts" (OuterVolumeSpecName: "scripts") pod "04dae116-ceca-4588-9cba-1266bfa92caf" (UID: "04dae116-ceca-4588-9cba-1266bfa92caf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.439438 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "04dae116-ceca-4588-9cba-1266bfa92caf" (UID: "04dae116-ceca-4588-9cba-1266bfa92caf"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.472110 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04dae116-ceca-4588-9cba-1266bfa92caf" (UID: "04dae116-ceca-4588-9cba-1266bfa92caf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.498579 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-config-data" (OuterVolumeSpecName: "config-data") pod "04dae116-ceca-4588-9cba-1266bfa92caf" (UID: "04dae116-ceca-4588-9cba-1266bfa92caf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.532478 4886 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04dae116-ceca-4588-9cba-1266bfa92caf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.532523 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.532567 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rkdq\" (UniqueName: \"kubernetes.io/projected/04dae116-ceca-4588-9cba-1266bfa92caf-kube-api-access-2rkdq\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.532580 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.532592 4886 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.532603 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04dae116-ceca-4588-9cba-1266bfa92caf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.861943 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e9fd03-4a7f-45c7-83e6-608ad7648766","Type":"ContainerStarted","Data":"9d8e62602d1305f37f8a51b73f2c104ca86a67a3331fc3d826d42ccf0fac24ce"} Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.866058 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j5gfz" 
event={"ID":"04dae116-ceca-4588-9cba-1266bfa92caf","Type":"ContainerDied","Data":"3d72bfc601ef7f8aa44a162e8a49bc717daf618d327e886ac546527a7c3a7e17"} Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.866091 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j5gfz" Jan 29 17:07:44 crc kubenswrapper[4886]: I0129 17:07:44.866114 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d72bfc601ef7f8aa44a162e8a49bc717daf618d327e886ac546527a7c3a7e17" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.177036 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 17:07:45 crc kubenswrapper[4886]: E0129 17:07:45.177927 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04dae116-ceca-4588-9cba-1266bfa92caf" containerName="cinder-db-sync" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.177944 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="04dae116-ceca-4588-9cba-1266bfa92caf" containerName="cinder-db-sync" Jan 29 17:07:45 crc kubenswrapper[4886]: E0129 17:07:45.177985 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0058f32-ae80-4dde-9dce-095c62f45979" containerName="heat-db-sync" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.177993 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0058f32-ae80-4dde-9dce-095c62f45979" containerName="heat-db-sync" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.178241 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0058f32-ae80-4dde-9dce-095c62f45979" containerName="heat-db-sync" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.178259 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="04dae116-ceca-4588-9cba-1266bfa92caf" containerName="cinder-db-sync" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.179735 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.188970 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.189164 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ldtkt" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.189263 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.189396 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.189668 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.260762 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.260811 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.260921 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79744cfd-ecdc-42c4-b70e-bb957640a11c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.260940 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrzlm\" (UniqueName: \"kubernetes.io/projected/79744cfd-ecdc-42c4-b70e-bb957640a11c-kube-api-access-zrzlm\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.261011 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-scripts\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.261028 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.332760 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-jsg5q"] Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.332980 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" 
podUID="9ac97bdb-475a-4061-96b0-1423be10bb5b" containerName="dnsmasq-dns" containerID="cri-o://a528683376327e5804a4ea1ec553e70518415fe775e3feb358ab1099f935a1fb" gracePeriod=10 Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.340043 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.369731 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.369777 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.369882 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79744cfd-ecdc-42c4-b70e-bb957640a11c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.369900 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrzlm\" (UniqueName: \"kubernetes.io/projected/79744cfd-ecdc-42c4-b70e-bb957640a11c-kube-api-access-zrzlm\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.369964 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-scripts\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.369980 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.372423 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79744cfd-ecdc-42c4-b70e-bb957640a11c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.383053 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.396972 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-dv5ch"] Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.404902 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.407851 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.408258 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.418972 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrzlm\" (UniqueName: \"kubernetes.io/projected/79744cfd-ecdc-42c4-b70e-bb957640a11c-kube-api-access-zrzlm\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.422683 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-dv5ch"] Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.423915 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-scripts\") pod \"cinder-scheduler-0\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.492882 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-config\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.493002 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.493188 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.493307 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm767\" (UniqueName: \"kubernetes.io/projected/a4e533f1-e8eb-4426-906e-35354266d610-kube-api-access-rm767\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.493350 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.493395 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.513002 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.569376 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.571206 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.629377 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.629864 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.629975 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.629996 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm767\" (UniqueName: \"kubernetes.io/projected/a4e533f1-e8eb-4426-906e-35354266d610-kube-api-access-rm767\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.630041 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.630210 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-config\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.631262 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-config\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.631832 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.632819 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.638489 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.643008 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.650368 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.642052 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.708583 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm767\" (UniqueName: \"kubernetes.io/projected/a4e533f1-e8eb-4426-906e-35354266d610-kube-api-access-rm767\") pod \"dnsmasq-dns-795f4db4bc-dv5ch\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.732846 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.732962 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-scripts\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.732981 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-logs\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 
17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.733005 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-769bq\" (UniqueName: \"kubernetes.io/projected/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-kube-api-access-769bq\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.733067 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.733195 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.733212 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data-custom\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.835589 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.835728 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.835729 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.835751 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data-custom\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.836049 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.836213 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-scripts\") pod 
\"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.836240 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-logs\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.836312 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-769bq\" (UniqueName: \"kubernetes.io/projected/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-kube-api-access-769bq\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.837088 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-logs\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.849694 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.852082 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-scripts\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.852759 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.867644 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data-custom\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.867791 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-769bq\" (UniqueName: \"kubernetes.io/projected/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-kube-api-access-769bq\") pod \"cinder-api-0\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " pod="openstack/cinder-api-0" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.970378 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:45 crc kubenswrapper[4886]: I0129 17:07:45.976137 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 29 17:07:46 crc kubenswrapper[4886]: I0129 17:07:46.195317 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 17:07:46 crc kubenswrapper[4886]: I0129 17:07:46.529744 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-dv5ch"] Jan 29 17:07:46 crc kubenswrapper[4886]: W0129 17:07:46.534253 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4e533f1_e8eb_4426_906e_35354266d610.slice/crio-9b0b0e72dbfa9a690950a4cb5f65710c32c08a1c18a1d00cb2ec594ac0b3c616 WatchSource:0}: Error finding container 9b0b0e72dbfa9a690950a4cb5f65710c32c08a1c18a1d00cb2ec594ac0b3c616: Status 404 returned error can't find the container with id 9b0b0e72dbfa9a690950a4cb5f65710c32c08a1c18a1d00cb2ec594ac0b3c616 Jan 29 17:07:46 crc kubenswrapper[4886]: I0129 17:07:46.725476 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 17:07:46 crc kubenswrapper[4886]: I0129 17:07:46.917596 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2","Type":"ContainerStarted","Data":"281f6c4ddc5b493d23b42767bfd856396f39345c359337095a814d651f657b39"} Jan 29 17:07:46 crc kubenswrapper[4886]: I0129 17:07:46.938513 4886 generic.go:334] "Generic (PLEG): container finished" podID="9ac97bdb-475a-4061-96b0-1423be10bb5b" containerID="a528683376327e5804a4ea1ec553e70518415fe775e3feb358ab1099f935a1fb" exitCode=0 Jan 29 17:07:46 crc kubenswrapper[4886]: I0129 17:07:46.938581 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" event={"ID":"9ac97bdb-475a-4061-96b0-1423be10bb5b","Type":"ContainerDied","Data":"a528683376327e5804a4ea1ec553e70518415fe775e3feb358ab1099f935a1fb"} Jan 29 17:07:46 crc kubenswrapper[4886]: I0129 17:07:46.945684 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" event={"ID":"a4e533f1-e8eb-4426-906e-35354266d610","Type":"ContainerStarted","Data":"9b0b0e72dbfa9a690950a4cb5f65710c32c08a1c18a1d00cb2ec594ac0b3c616"} Jan 29 17:07:46 crc kubenswrapper[4886]: I0129 17:07:46.953343 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"79744cfd-ecdc-42c4-b70e-bb957640a11c","Type":"ContainerStarted","Data":"eb5bacab0ef6b5257f3ba5127165c9496314e35a73af62c8e260a0b9866372e0"} Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.277277 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.318394 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.386172 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-config\") pod \"9ac97bdb-475a-4061-96b0-1423be10bb5b\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.386229 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-nb\") pod \"9ac97bdb-475a-4061-96b0-1423be10bb5b\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.386354 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-svc\") pod \"9ac97bdb-475a-4061-96b0-1423be10bb5b\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.386377 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-swift-storage-0\") pod \"9ac97bdb-475a-4061-96b0-1423be10bb5b\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.386554 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-sb\") pod \"9ac97bdb-475a-4061-96b0-1423be10bb5b\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.386611 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzzbl\" (UniqueName: \"kubernetes.io/projected/9ac97bdb-475a-4061-96b0-1423be10bb5b-kube-api-access-tzzbl\") pod \"9ac97bdb-475a-4061-96b0-1423be10bb5b\" (UID: \"9ac97bdb-475a-4061-96b0-1423be10bb5b\") " Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.402344 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac97bdb-475a-4061-96b0-1423be10bb5b-kube-api-access-tzzbl" (OuterVolumeSpecName: "kube-api-access-tzzbl") pod "9ac97bdb-475a-4061-96b0-1423be10bb5b" (UID: "9ac97bdb-475a-4061-96b0-1423be10bb5b"). InnerVolumeSpecName "kube-api-access-tzzbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.464822 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9ac97bdb-475a-4061-96b0-1423be10bb5b" (UID: "9ac97bdb-475a-4061-96b0-1423be10bb5b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.504971 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzzbl\" (UniqueName: \"kubernetes.io/projected/9ac97bdb-475a-4061-96b0-1423be10bb5b-kube-api-access-tzzbl\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.505217 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.558731 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-config" (OuterVolumeSpecName: "config") pod "9ac97bdb-475a-4061-96b0-1423be10bb5b" (UID: "9ac97bdb-475a-4061-96b0-1423be10bb5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.578082 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ac97bdb-475a-4061-96b0-1423be10bb5b" (UID: "9ac97bdb-475a-4061-96b0-1423be10bb5b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.598027 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ac97bdb-475a-4061-96b0-1423be10bb5b" (UID: "9ac97bdb-475a-4061-96b0-1423be10bb5b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.615410 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.615443 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.615455 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.625065 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ac97bdb-475a-4061-96b0-1423be10bb5b" (UID: "9ac97bdb-475a-4061-96b0-1423be10bb5b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.717928 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ac97bdb-475a-4061-96b0-1423be10bb5b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.968678 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2","Type":"ContainerStarted","Data":"b8d0ea03cf6cf69b26bdf55d5de8b0049bbdd593eaf6801f03f5d5761e184e45"} Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.971040 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.971051 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-jsg5q" event={"ID":"9ac97bdb-475a-4061-96b0-1423be10bb5b","Type":"ContainerDied","Data":"1724f7bc6805ebdf2ea8515900b97a42430de51ca57fd28deec62f818f0909c2"} Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.971278 4886 scope.go:117] "RemoveContainer" containerID="a528683376327e5804a4ea1ec553e70518415fe775e3feb358ab1099f935a1fb" Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.980761 4886 generic.go:334] "Generic (PLEG): container finished" podID="a4e533f1-e8eb-4426-906e-35354266d610" containerID="2012816a934b66e60ffd90c59e1fa261b396b239468adba78a0dedfe4395c1be" exitCode=0 Jan 29 17:07:47 crc kubenswrapper[4886]: I0129 17:07:47.980814 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" event={"ID":"a4e533f1-e8eb-4426-906e-35354266d610","Type":"ContainerDied","Data":"2012816a934b66e60ffd90c59e1fa261b396b239468adba78a0dedfe4395c1be"} Jan 29 17:07:48 crc kubenswrapper[4886]: I0129 17:07:48.030366 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-jsg5q"] Jan 29 17:07:48 crc kubenswrapper[4886]: I0129 17:07:48.040114 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-jsg5q"] Jan 29 17:07:48 crc kubenswrapper[4886]: I0129 17:07:48.346035 4886 scope.go:117] "RemoveContainer" containerID="d6011c232b01e3892826684cea65e05a2b5a15c43a2d859d545b9c20ac294a14" Jan 29 17:07:48 crc kubenswrapper[4886]: I0129 17:07:48.492850 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:48 crc kubenswrapper[4886]: I0129 17:07:48.513536 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:48 crc kubenswrapper[4886]: I0129 17:07:48.652465 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac97bdb-475a-4061-96b0-1423be10bb5b" path="/var/lib/kubelet/pods/9ac97bdb-475a-4061-96b0-1423be10bb5b/volumes" Jan 29 17:07:49 crc kubenswrapper[4886]: I0129 17:07:49.002693 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5499bdc9-q6hr4" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.107312 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 29 17:07:50 crc kubenswrapper[4886]: E0129 17:07:50.120176 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac97bdb-475a-4061-96b0-1423be10bb5b" containerName="init" Jan 29 17:07:50 crc kubenswrapper[4886]: 
I0129 17:07:50.120193 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac97bdb-475a-4061-96b0-1423be10bb5b" containerName="init" Jan 29 17:07:50 crc kubenswrapper[4886]: E0129 17:07:50.120229 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac97bdb-475a-4061-96b0-1423be10bb5b" containerName="dnsmasq-dns" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.120235 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac97bdb-475a-4061-96b0-1423be10bb5b" containerName="dnsmasq-dns" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.120476 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac97bdb-475a-4061-96b0-1423be10bb5b" containerName="dnsmasq-dns" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.121285 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.127274 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-jq45j" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.127529 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.128442 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.167428 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"79744cfd-ecdc-42c4-b70e-bb957640a11c","Type":"ContainerStarted","Data":"dd01b92d286ab63ee03bff172b9b03aa69d2a7db780bc4a7761f9cf8e7790134"} Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.181965 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2","Type":"ContainerStarted","Data":"427b1632fa7330e8e999fa999675e7326ae042f6f381126c9b2276f118bf9b8f"} Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.182212 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" containerName="cinder-api-log" containerID="cri-o://b8d0ea03cf6cf69b26bdf55d5de8b0049bbdd593eaf6801f03f5d5761e184e45" gracePeriod=30 Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.182310 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.182496 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" containerName="cinder-api" containerID="cri-o://427b1632fa7330e8e999fa999675e7326ae042f6f381126c9b2276f118bf9b8f" gracePeriod=30 Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.199266 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.218669 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be43aab6-3888-4260-a85c-147e2ae0a36d-openstack-config-secret\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.218713 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be43aab6-3888-4260-a85c-147e2ae0a36d-openstack-config\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.218737 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be43aab6-3888-4260-a85c-147e2ae0a36d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.218817 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4t4b\" (UniqueName: \"kubernetes.io/projected/be43aab6-3888-4260-a85c-147e2ae0a36d-kube-api-access-l4t4b\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.225567 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" event={"ID":"a4e533f1-e8eb-4426-906e-35354266d610","Type":"ContainerStarted","Data":"bfb4e65e7631317b75e0b15c39b90031add550dcb40292d0be47c6410cfdc89e"} Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.226936 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.250549 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.25052849 podStartE2EDuration="5.25052849s" podCreationTimestamp="2026-01-29 17:07:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:50.213887938 +0000 UTC m=+2753.122607210" watchObservedRunningTime="2026-01-29 17:07:50.25052849 +0000 UTC m=+2753.159247762" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.288686 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" podStartSLOduration=5.288565661 podStartE2EDuration="5.288565661s" podCreationTimestamp="2026-01-29 17:07:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:50.245275182 +0000 UTC m=+2753.153994454" watchObservedRunningTime="2026-01-29 17:07:50.288565661 +0000 UTC m=+2753.197284933" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.323635 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be43aab6-3888-4260-a85c-147e2ae0a36d-openstack-config-secret\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.323675 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be43aab6-3888-4260-a85c-147e2ae0a36d-openstack-config\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.323705 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be43aab6-3888-4260-a85c-147e2ae0a36d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.323800 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4t4b\" (UniqueName: \"kubernetes.io/projected/be43aab6-3888-4260-a85c-147e2ae0a36d-kube-api-access-l4t4b\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.325490 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be43aab6-3888-4260-a85c-147e2ae0a36d-openstack-config\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.336019 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be43aab6-3888-4260-a85c-147e2ae0a36d-openstack-config-secret\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.346001 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be43aab6-3888-4260-a85c-147e2ae0a36d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.358901 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4t4b\" (UniqueName: \"kubernetes.io/projected/be43aab6-3888-4260-a85c-147e2ae0a36d-kube-api-access-l4t4b\") pod \"openstackclient\" (UID: \"be43aab6-3888-4260-a85c-147e2ae0a36d\") " pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.482137 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.844177 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:50 crc kubenswrapper[4886]: I0129 17:07:50.846885 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-795d8c76d8-x2zqv" Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.253456 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"79744cfd-ecdc-42c4-b70e-bb957640a11c","Type":"ContainerStarted","Data":"3d38ab3f39b8f10e80b68dcbf56b94dd2483224e667fea1a1a75ada7c0ecf901"} Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.259829 4886 generic.go:334] "Generic (PLEG): container finished" podID="43da0665-7e6a-4176-ae84-71128a89a243" containerID="c4ce1f7996acaa4140e3f499ede2bc0c80a3f2eb7c1df999e0b4f5903e1d75cf" exitCode=0 Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.259878 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qglhp" event={"ID":"43da0665-7e6a-4176-ae84-71128a89a243","Type":"ContainerDied","Data":"c4ce1f7996acaa4140e3f499ede2bc0c80a3f2eb7c1df999e0b4f5903e1d75cf"} Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.288711 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.194423043 podStartE2EDuration="6.288692071s" podCreationTimestamp="2026-01-29 17:07:45 +0000 UTC" firstStartedPulling="2026-01-29 17:07:46.253077438 +0000 UTC m=+2749.161796710" lastFinishedPulling="2026-01-29 17:07:48.347346456 +0000 UTC m=+2751.256065738" observedRunningTime="2026-01-29 17:07:51.276255941 +0000 UTC m=+2754.184975233" watchObservedRunningTime="2026-01-29 17:07:51.288692071 +0000 UTC m=+2754.197411343" Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.301123 4886 generic.go:334] "Generic (PLEG): container finished" podID="cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" containerID="427b1632fa7330e8e999fa999675e7326ae042f6f381126c9b2276f118bf9b8f" exitCode=0 Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.301150 4886 generic.go:334] "Generic (PLEG): container finished" podID="cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" containerID="b8d0ea03cf6cf69b26bdf55d5de8b0049bbdd593eaf6801f03f5d5761e184e45" exitCode=143 Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.302174 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2","Type":"ContainerDied","Data":"427b1632fa7330e8e999fa999675e7326ae042f6f381126c9b2276f118bf9b8f"} Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.302199 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2","Type":"ContainerDied","Data":"b8d0ea03cf6cf69b26bdf55d5de8b0049bbdd593eaf6801f03f5d5761e184e45"} Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.391682 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.714417 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 17:07:51 crc kubenswrapper[4886]: W0129 17:07:51.720256 4886 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe43aab6_3888_4260_a85c_147e2ae0a36d.slice/crio-f55162e34c8bb62e3e1744e4e6436f51562cbf1a2bd6ce27de003f68256e0764 WatchSource:0}: Error finding container f55162e34c8bb62e3e1744e4e6436f51562cbf1a2bd6ce27de003f68256e0764: Status 404 returned error can't find the container with id f55162e34c8bb62e3e1744e4e6436f51562cbf1a2bd6ce27de003f68256e0764 Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.826362 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fb894ff6d-w7s26" Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.965084 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55f7ff7dd6-jj4jw"] Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.965510 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" podUID="ea36feff-2438-49e4-b779-0b083addd0a8" containerName="barbican-api-log" containerID="cri-o://f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5" gracePeriod=30 Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.965731 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" podUID="ea36feff-2438-49e4-b779-0b083addd0a8" containerName="barbican-api" containerID="cri-o://8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2" gracePeriod=30 Jan 29 17:07:51 crc kubenswrapper[4886]: I0129 17:07:51.989548 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.099966 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-logs\") pod \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.107799 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data-custom\") pod \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.107932 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-combined-ca-bundle\") pod \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.108023 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-scripts\") pod \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.108138 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data\") pod \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.108271 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-etc-machine-id\") pod \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.108398 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-769bq\" (UniqueName: \"kubernetes.io/projected/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-kube-api-access-769bq\") pod \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\" (UID: \"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2\") " Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.100788 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-logs" (OuterVolumeSpecName: "logs") pod "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" (UID: "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.114573 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" (UID: "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.120744 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-scripts" (OuterVolumeSpecName: "scripts") pod "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" (UID: "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.143300 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" (UID: "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.150548 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-kube-api-access-769bq" (OuterVolumeSpecName: "kube-api-access-769bq") pod "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" (UID: "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2"). InnerVolumeSpecName "kube-api-access-769bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.210637 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-769bq\" (UniqueName: \"kubernetes.io/projected/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-kube-api-access-769bq\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.210668 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.210679 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.210687 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.210696 4886 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.283481 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" (UID: "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.314080 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.344171 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc58d1b4-0d5e-4768-9a82-b6bbcca420a2","Type":"ContainerDied","Data":"281f6c4ddc5b493d23b42767bfd856396f39345c359337095a814d651f657b39"} Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.344222 4886 scope.go:117] "RemoveContainer" containerID="427b1632fa7330e8e999fa999675e7326ae042f6f381126c9b2276f118bf9b8f" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.344218 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.365760 4886 generic.go:334] "Generic (PLEG): container finished" podID="ea36feff-2438-49e4-b779-0b083addd0a8" containerID="f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5" exitCode=143 Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.365870 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" event={"ID":"ea36feff-2438-49e4-b779-0b083addd0a8","Type":"ContainerDied","Data":"f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5"} Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.370068 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"be43aab6-3888-4260-a85c-147e2ae0a36d","Type":"ContainerStarted","Data":"f55162e34c8bb62e3e1744e4e6436f51562cbf1a2bd6ce27de003f68256e0764"} Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.391825 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data" (OuterVolumeSpecName: "config-data") pod "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" (UID: "cc58d1b4-0d5e-4768-9a82-b6bbcca420a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.406717 4886 scope.go:117] "RemoveContainer" containerID="b8d0ea03cf6cf69b26bdf55d5de8b0049bbdd593eaf6801f03f5d5761e184e45" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.421766 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.733746 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.766396 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.777701 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 29 17:07:52 crc kubenswrapper[4886]: E0129 17:07:52.778382 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" containerName="cinder-api-log" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.778469 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" containerName="cinder-api-log" Jan 29 17:07:52 crc kubenswrapper[4886]: E0129 17:07:52.778540 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" containerName="cinder-api" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.778592 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" containerName="cinder-api" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.778865 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" containerName="cinder-api" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.778949 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" containerName="cinder-api-log" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.780174 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.783611 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.783786 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.784510 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.790912 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.828009 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-config-data-custom\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.828910 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3573eaa4-4c27-4747-a691-15ae61d152f3-logs\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.828975 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.829039 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.829058 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4j4v\" (UniqueName: \"kubernetes.io/projected/3573eaa4-4c27-4747-a691-15ae61d152f3-kube-api-access-v4j4v\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.829079 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-config-data\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.829153 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.829220 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-scripts\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.829247 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3573eaa4-4c27-4747-a691-15ae61d152f3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.928709 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qglhp" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.931235 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-config-data-custom\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.931508 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3573eaa4-4c27-4747-a691-15ae61d152f3-logs\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.931644 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.931780 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.931914 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4j4v\" (UniqueName: \"kubernetes.io/projected/3573eaa4-4c27-4747-a691-15ae61d152f3-kube-api-access-v4j4v\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.932032 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-config-data\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.932214 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.932380 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-scripts\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " 
pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.932510 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3573eaa4-4c27-4747-a691-15ae61d152f3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.932781 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3573eaa4-4c27-4747-a691-15ae61d152f3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.933507 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3573eaa4-4c27-4747-a691-15ae61d152f3-logs\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.947217 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.951374 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-scripts\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.952173 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-config-data-custom\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.952982 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.966029 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4j4v\" (UniqueName: \"kubernetes.io/projected/3573eaa4-4c27-4747-a691-15ae61d152f3-kube-api-access-v4j4v\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.966267 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:52 crc kubenswrapper[4886]: I0129 17:07:52.966924 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3573eaa4-4c27-4747-a691-15ae61d152f3-config-data\") pod \"cinder-api-0\" (UID: \"3573eaa4-4c27-4747-a691-15ae61d152f3\") " pod="openstack/cinder-api-0" Jan 29 17:07:53 crc kubenswrapper[4886]: 
I0129 17:07:53.131909 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.136162 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkvgz\" (UniqueName: \"kubernetes.io/projected/43da0665-7e6a-4176-ae84-71128a89a243-kube-api-access-vkvgz\") pod \"43da0665-7e6a-4176-ae84-71128a89a243\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.136209 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-combined-ca-bundle\") pod \"43da0665-7e6a-4176-ae84-71128a89a243\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.136315 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-config\") pod \"43da0665-7e6a-4176-ae84-71128a89a243\" (UID: \"43da0665-7e6a-4176-ae84-71128a89a243\") " Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.154593 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43da0665-7e6a-4176-ae84-71128a89a243-kube-api-access-vkvgz" (OuterVolumeSpecName: "kube-api-access-vkvgz") pod "43da0665-7e6a-4176-ae84-71128a89a243" (UID: "43da0665-7e6a-4176-ae84-71128a89a243"). InnerVolumeSpecName "kube-api-access-vkvgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.176191 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-config" (OuterVolumeSpecName: "config") pod "43da0665-7e6a-4176-ae84-71128a89a243" (UID: "43da0665-7e6a-4176-ae84-71128a89a243"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.222450 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43da0665-7e6a-4176-ae84-71128a89a243" (UID: "43da0665-7e6a-4176-ae84-71128a89a243"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.239839 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkvgz\" (UniqueName: \"kubernetes.io/projected/43da0665-7e6a-4176-ae84-71128a89a243-kube-api-access-vkvgz\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.240674 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.240692 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/43da0665-7e6a-4176-ae84-71128a89a243-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.391964 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qglhp" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.393580 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qglhp" event={"ID":"43da0665-7e6a-4176-ae84-71128a89a243","Type":"ContainerDied","Data":"466198a6dbe8073f38dde3862e5bfda50e204a4fc5dd98f6c616c1e63cc8d1a0"} Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.393653 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="466198a6dbe8073f38dde3862e5bfda50e204a4fc5dd98f6c616c1e63cc8d1a0" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.409413 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e9fd03-4a7f-45c7-83e6-608ad7648766","Type":"ContainerStarted","Data":"44a3542db94b31c96db714bd6c3559bd3e1d7d7a66d633f86abe33fb9a6f4bd0"} Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.410616 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.438500 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.251302755 podStartE2EDuration="16.438482011s" podCreationTimestamp="2026-01-29 17:07:37 +0000 UTC" firstStartedPulling="2026-01-29 17:07:38.553135205 +0000 UTC m=+2741.461854487" lastFinishedPulling="2026-01-29 17:07:51.740314471 +0000 UTC m=+2754.649033743" observedRunningTime="2026-01-29 17:07:53.436793273 +0000 UTC m=+2756.345512545" watchObservedRunningTime="2026-01-29 17:07:53.438482011 +0000 UTC m=+2756.347201283" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.523948 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-dv5ch"] Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.524171 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" podUID="a4e533f1-e8eb-4426-906e-35354266d610" containerName="dnsmasq-dns" containerID="cri-o://bfb4e65e7631317b75e0b15c39b90031add550dcb40292d0be47c6410cfdc89e" gracePeriod=10 Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.586585 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lbcqc"] Jan 29 17:07:53 crc kubenswrapper[4886]: E0129 17:07:53.618408 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43da0665-7e6a-4176-ae84-71128a89a243" containerName="neutron-db-sync" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.618449 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="43da0665-7e6a-4176-ae84-71128a89a243" containerName="neutron-db-sync" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.649639 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="43da0665-7e6a-4176-ae84-71128a89a243" containerName="neutron-db-sync" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.660178 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.680372 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lbcqc"] Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.732567 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7854df7c4b-dn4j7"] Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.747734 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.751643 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wvjgr" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.751890 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.756457 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7854df7c4b-dn4j7"] Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.760576 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.764379 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.766260 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.766346 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rwhj\" (UniqueName: \"kubernetes.io/projected/77e77908-f078-4711-8c40-5e0bbda2a830-kube-api-access-6rwhj\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.766397 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.766415 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.766490 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-config\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.766510 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.807639 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.870049 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-combined-ca-bundle\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.870141 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.870183 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rwhj\" (UniqueName: \"kubernetes.io/projected/77e77908-f078-4711-8c40-5e0bbda2a830-kube-api-access-6rwhj\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.870212 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.870230 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.870254 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-ovndb-tls-certs\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.870314 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-config\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.870364 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.870384 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-httpd-config\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.870439 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-config\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.870487 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhjq8\" (UniqueName: \"kubernetes.io/projected/0ff8b641-0d76-41ce-b6ac-7d708effebc0-kube-api-access-nhjq8\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.871300 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.872080 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.874598 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.874946 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-config\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.875786 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.897758 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rwhj\" (UniqueName: \"kubernetes.io/projected/77e77908-f078-4711-8c40-5e0bbda2a830-kube-api-access-6rwhj\") pod \"dnsmasq-dns-5c9776ccc5-lbcqc\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:53 crc kubenswrapper[4886]: 
I0129 17:07:53.973002 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-combined-ca-bundle\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.973420 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-ovndb-tls-certs\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.973493 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-httpd-config\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.973546 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-config\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.973582 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhjq8\" (UniqueName: \"kubernetes.io/projected/0ff8b641-0d76-41ce-b6ac-7d708effebc0-kube-api-access-nhjq8\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:53 crc kubenswrapper[4886]: I0129 17:07:53.998604 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-ovndb-tls-certs\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.003171 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-httpd-config\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.014348 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-combined-ca-bundle\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.014473 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-config\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.022961 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhjq8\" (UniqueName: 
\"kubernetes.io/projected/0ff8b641-0d76-41ce-b6ac-7d708effebc0-kube-api-access-nhjq8\") pod \"neutron-7854df7c4b-dn4j7\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.042873 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.089485 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.442547 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3573eaa4-4c27-4747-a691-15ae61d152f3","Type":"ContainerStarted","Data":"1385108d3e83430f45d60172a4f29a52c80dc5f81117e1d2b4da4da320eaf2a2"} Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.479842 4886 generic.go:334] "Generic (PLEG): container finished" podID="a4e533f1-e8eb-4426-906e-35354266d610" containerID="bfb4e65e7631317b75e0b15c39b90031add550dcb40292d0be47c6410cfdc89e" exitCode=0 Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.481104 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" event={"ID":"a4e533f1-e8eb-4426-906e-35354266d610","Type":"ContainerDied","Data":"bfb4e65e7631317b75e0b15c39b90031add550dcb40292d0be47c6410cfdc89e"} Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.481130 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" event={"ID":"a4e533f1-e8eb-4426-906e-35354266d610","Type":"ContainerDied","Data":"9b0b0e72dbfa9a690950a4cb5f65710c32c08a1c18a1d00cb2ec594ac0b3c616"} Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.481142 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b0b0e72dbfa9a690950a4cb5f65710c32c08a1c18a1d00cb2ec594ac0b3c616" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.496544 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.595453 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-svc\") pod \"a4e533f1-e8eb-4426-906e-35354266d610\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.597305 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm767\" (UniqueName: \"kubernetes.io/projected/a4e533f1-e8eb-4426-906e-35354266d610-kube-api-access-rm767\") pod \"a4e533f1-e8eb-4426-906e-35354266d610\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.597825 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-config\") pod \"a4e533f1-e8eb-4426-906e-35354266d610\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.597926 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-nb\") pod \"a4e533f1-e8eb-4426-906e-35354266d610\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.597986 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-sb\") pod \"a4e533f1-e8eb-4426-906e-35354266d610\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.598099 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-swift-storage-0\") pod \"a4e533f1-e8eb-4426-906e-35354266d610\" (UID: \"a4e533f1-e8eb-4426-906e-35354266d610\") " Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.606675 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e533f1-e8eb-4426-906e-35354266d610-kube-api-access-rm767" (OuterVolumeSpecName: "kube-api-access-rm767") pod "a4e533f1-e8eb-4426-906e-35354266d610" (UID: "a4e533f1-e8eb-4426-906e-35354266d610"). InnerVolumeSpecName "kube-api-access-rm767". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.649601 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc58d1b4-0d5e-4768-9a82-b6bbcca420a2" path="/var/lib/kubelet/pods/cc58d1b4-0d5e-4768-9a82-b6bbcca420a2/volumes" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.687220 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-config" (OuterVolumeSpecName: "config") pod "a4e533f1-e8eb-4426-906e-35354266d610" (UID: "a4e533f1-e8eb-4426-906e-35354266d610"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.687650 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a4e533f1-e8eb-4426-906e-35354266d610" (UID: "a4e533f1-e8eb-4426-906e-35354266d610"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.719544 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.719576 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm767\" (UniqueName: \"kubernetes.io/projected/a4e533f1-e8eb-4426-906e-35354266d610-kube-api-access-rm767\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.719589 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.719878 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4e533f1-e8eb-4426-906e-35354266d610" (UID: "a4e533f1-e8eb-4426-906e-35354266d610"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.759197 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4e533f1-e8eb-4426-906e-35354266d610" (UID: "a4e533f1-e8eb-4426-906e-35354266d610"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.825428 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.825461 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.835394 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a4e533f1-e8eb-4426-906e-35354266d610" (UID: "a4e533f1-e8eb-4426-906e-35354266d610"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.848646 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lbcqc"] Jan 29 17:07:54 crc kubenswrapper[4886]: I0129 17:07:54.935921 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4e533f1-e8eb-4426-906e-35354266d610-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.154600 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7854df7c4b-dn4j7"] Jan 29 17:07:55 crc kubenswrapper[4886]: W0129 17:07:55.173522 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ff8b641_0d76_41ce_b6ac_7d708effebc0.slice/crio-e7a3e9e15910d73e70e0b6e954b7743de9f55b25dd0f0bfd34c348eb738633d2 WatchSource:0}: Error finding container e7a3e9e15910d73e70e0b6e954b7743de9f55b25dd0f0bfd34c348eb738633d2: Status 404 returned error can't find the container with id e7a3e9e15910d73e70e0b6e954b7743de9f55b25dd0f0bfd34c348eb738633d2 Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.504677 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3573eaa4-4c27-4747-a691-15ae61d152f3","Type":"ContainerStarted","Data":"53e60943629db0c2467c81d05149376435438eedc3af65a98b6e31b78f97981c"} Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.513786 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.514396 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7854df7c4b-dn4j7" event={"ID":"0ff8b641-0d76-41ce-b6ac-7d708effebc0","Type":"ContainerStarted","Data":"75e8cf0cad7d6d59d88f3f3bd6a97cab33d3691af01126d62cdae48b3d82240f"} Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.514444 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7854df7c4b-dn4j7" event={"ID":"0ff8b641-0d76-41ce-b6ac-7d708effebc0","Type":"ContainerStarted","Data":"e7a3e9e15910d73e70e0b6e954b7743de9f55b25dd0f0bfd34c348eb738633d2"} Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.521503 4886 generic.go:334] "Generic (PLEG): container finished" podID="77e77908-f078-4711-8c40-5e0bbda2a830" containerID="c105784d4cb4a65b24766afa5c392562f921a5e8ba938bcdad19639f8052e82a" exitCode=0 Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.523264 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" event={"ID":"77e77908-f078-4711-8c40-5e0bbda2a830","Type":"ContainerDied","Data":"c105784d4cb4a65b24766afa5c392562f921a5e8ba938bcdad19639f8052e82a"} Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.523302 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" event={"ID":"77e77908-f078-4711-8c40-5e0bbda2a830","Type":"ContainerStarted","Data":"00c8741e78cdef06ac95516aebc006fef061abb10bc976627d894974f2fc0223"} Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.523360 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-dv5ch" Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.587389 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-dv5ch"] Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.597378 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-dv5ch"] Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.743664 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" podUID="ea36feff-2438-49e4-b779-0b083addd0a8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.215:9311/healthcheck\": read tcp 10.217.0.2:52532->10.217.0.215:9311: read: connection reset by peer" Jan 29 17:07:55 crc kubenswrapper[4886]: I0129 17:07:55.744010 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" podUID="ea36feff-2438-49e4-b779-0b083addd0a8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.215:9311/healthcheck\": read tcp 10.217.0.2:52540->10.217.0.215:9311: read: connection reset by peer" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.002611 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-846d49f49c-kc98b"] Jan 29 17:07:56 crc kubenswrapper[4886]: E0129 17:07:56.003455 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4e533f1-e8eb-4426-906e-35354266d610" containerName="dnsmasq-dns" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.003474 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4e533f1-e8eb-4426-906e-35354266d610" containerName="dnsmasq-dns" Jan 29 17:07:56 crc kubenswrapper[4886]: E0129 17:07:56.003547 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4e533f1-e8eb-4426-906e-35354266d610" containerName="init" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.003556 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4e533f1-e8eb-4426-906e-35354266d610" containerName="init" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.004069 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4e533f1-e8eb-4426-906e-35354266d610" containerName="dnsmasq-dns" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.005463 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.011871 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.012086 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.029391 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-846d49f49c-kc98b"] Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.106720 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-config\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.107587 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-internal-tls-certs\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.107675 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-combined-ca-bundle\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.107703 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-public-tls-certs\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.107809 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5gnk\" (UniqueName: \"kubernetes.io/projected/344feff6-8139-425e-b7dc-f35fe5b17247-kube-api-access-x5gnk\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.107838 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-httpd-config\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.107886 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-ovndb-tls-certs\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.178545 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 29 17:07:56 crc 
kubenswrapper[4886]: I0129 17:07:56.220339 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-internal-tls-certs\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.220522 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-combined-ca-bundle\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.220584 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-public-tls-certs\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.221627 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5gnk\" (UniqueName: \"kubernetes.io/projected/344feff6-8139-425e-b7dc-f35fe5b17247-kube-api-access-x5gnk\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.221671 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-httpd-config\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.221743 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-ovndb-tls-certs\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.221885 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-config\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.277423 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-config\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.281143 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-combined-ca-bundle\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.282020 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-httpd-config\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.284953 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-public-tls-certs\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.285039 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-internal-tls-certs\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.285350 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/344feff6-8139-425e-b7dc-f35fe5b17247-ovndb-tls-certs\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.289303 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5gnk\" (UniqueName: \"kubernetes.io/projected/344feff6-8139-425e-b7dc-f35fe5b17247-kube-api-access-x5gnk\") pod \"neutron-846d49f49c-kc98b\" (UID: \"344feff6-8139-425e-b7dc-f35fe5b17247\") " pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.345285 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.593105 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" event={"ID":"77e77908-f078-4711-8c40-5e0bbda2a830","Type":"ContainerStarted","Data":"53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8"} Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.593595 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.600636 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7854df7c4b-dn4j7" event={"ID":"0ff8b641-0d76-41ce-b6ac-7d708effebc0","Type":"ContainerStarted","Data":"f3ee0a56aaca61cef2419de911db690ccd8876c78a545e2b8864e16aa4ff333a"} Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.600960 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.601120 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.614292 4886 generic.go:334] "Generic (PLEG): container finished" podID="ea36feff-2438-49e4-b779-0b083addd0a8" containerID="8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2" exitCode=0 Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.615436 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" event={"ID":"ea36feff-2438-49e4-b779-0b083addd0a8","Type":"ContainerDied","Data":"8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2"} Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.615491 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" event={"ID":"ea36feff-2438-49e4-b779-0b083addd0a8","Type":"ContainerDied","Data":"e9dafe9a7a14455f6d6567489f608749fce9a0af4812468a1f99388ab4f30929"} Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.615512 4886 scope.go:117] "RemoveContainer" containerID="8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.621754 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" podStartSLOduration=3.621729079 podStartE2EDuration="3.621729079s" podCreationTimestamp="2026-01-29 17:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:56.613837836 +0000 UTC m=+2759.522557108" watchObservedRunningTime="2026-01-29 17:07:56.621729079 +0000 UTC m=+2759.530448351" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.635451 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea36feff-2438-49e4-b779-0b083addd0a8-logs\") pod \"ea36feff-2438-49e4-b779-0b083addd0a8\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.635576 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25kn5\" (UniqueName: \"kubernetes.io/projected/ea36feff-2438-49e4-b779-0b083addd0a8-kube-api-access-25kn5\") pod \"ea36feff-2438-49e4-b779-0b083addd0a8\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.635682 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-combined-ca-bundle\") pod \"ea36feff-2438-49e4-b779-0b083addd0a8\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.635748 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4e533f1-e8eb-4426-906e-35354266d610" path="/var/lib/kubelet/pods/a4e533f1-e8eb-4426-906e-35354266d610/volumes" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.635989 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data\") pod \"ea36feff-2438-49e4-b779-0b083addd0a8\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.635995 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ea36feff-2438-49e4-b779-0b083addd0a8-logs" (OuterVolumeSpecName: "logs") pod "ea36feff-2438-49e4-b779-0b083addd0a8" (UID: "ea36feff-2438-49e4-b779-0b083addd0a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.636013 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data-custom\") pod \"ea36feff-2438-49e4-b779-0b083addd0a8\" (UID: \"ea36feff-2438-49e4-b779-0b083addd0a8\") " Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.643239 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea36feff-2438-49e4-b779-0b083addd0a8-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.649509 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ea36feff-2438-49e4-b779-0b083addd0a8" (UID: "ea36feff-2438-49e4-b779-0b083addd0a8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.652263 4886 scope.go:117] "RemoveContainer" containerID="f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.652500 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea36feff-2438-49e4-b779-0b083addd0a8-kube-api-access-25kn5" (OuterVolumeSpecName: "kube-api-access-25kn5") pod "ea36feff-2438-49e4-b779-0b083addd0a8" (UID: "ea36feff-2438-49e4-b779-0b083addd0a8"). InnerVolumeSpecName "kube-api-access-25kn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.694844 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7854df7c4b-dn4j7" podStartSLOduration=3.694822057 podStartE2EDuration="3.694822057s" podCreationTimestamp="2026-01-29 17:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:56.666820689 +0000 UTC m=+2759.575539961" watchObservedRunningTime="2026-01-29 17:07:56.694822057 +0000 UTC m=+2759.603541329" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.712973 4886 scope.go:117] "RemoveContainer" containerID="8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2" Jan 29 17:07:56 crc kubenswrapper[4886]: E0129 17:07:56.713417 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2\": container with ID starting with 8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2 not found: ID does not exist" containerID="8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.713454 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2"} err="failed to get container status \"8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2\": rpc error: code = NotFound desc = could not find container \"8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2\": container with ID starting with 8bc4314631c2d889fe7693108f39c4873628c917868bfba6190057b2b09695e2 not found: ID does not exist" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.713480 4886 scope.go:117] "RemoveContainer" containerID="f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5" Jan 29 17:07:56 crc kubenswrapper[4886]: E0129 17:07:56.713672 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5\": container with ID starting with f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5 not found: ID does not exist" containerID="f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.713701 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5"} err="failed to get container status \"f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5\": rpc error: code = NotFound desc = could not find container \"f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5\": container with ID starting with f23c7cc8a8209a15c4be1f866071e7d19219ea178dc6b2496da6cf2510dacfc5 not found: ID does not exist" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.726673 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea36feff-2438-49e4-b779-0b083addd0a8" (UID: "ea36feff-2438-49e4-b779-0b083addd0a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.751658 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.751687 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25kn5\" (UniqueName: \"kubernetes.io/projected/ea36feff-2438-49e4-b779-0b083addd0a8-kube-api-access-25kn5\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.751700 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.764807 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.787436 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data" (OuterVolumeSpecName: "config-data") pod "ea36feff-2438-49e4-b779-0b083addd0a8" (UID: "ea36feff-2438-49e4-b779-0b083addd0a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:07:56 crc kubenswrapper[4886]: I0129 17:07:56.861860 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea36feff-2438-49e4-b779-0b083addd0a8-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.136030 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-846d49f49c-kc98b"] Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.649814 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3573eaa4-4c27-4747-a691-15ae61d152f3","Type":"ContainerStarted","Data":"8c944ebd33123a646892f458423259ef498c1ed94b2d49c157cf74c9b8b08797"} Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.650425 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.653404 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55f7ff7dd6-jj4jw" Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.656883 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="79744cfd-ecdc-42c4-b70e-bb957640a11c" containerName="cinder-scheduler" containerID="cri-o://dd01b92d286ab63ee03bff172b9b03aa69d2a7db780bc4a7761f9cf8e7790134" gracePeriod=30 Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.657031 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-846d49f49c-kc98b" event={"ID":"344feff6-8139-425e-b7dc-f35fe5b17247","Type":"ContainerStarted","Data":"c1a97bc78fc175b0c5ad8818956524b324cc6770550b8a275346ef7d541fd8eb"} Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.657104 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-846d49f49c-kc98b" event={"ID":"344feff6-8139-425e-b7dc-f35fe5b17247","Type":"ContainerStarted","Data":"1e02866a9505d80313275c6450c11906a9e90c56b2c3f33739805d8c22dbd4ce"} Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.657121 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-846d49f49c-kc98b" event={"ID":"344feff6-8139-425e-b7dc-f35fe5b17247","Type":"ContainerStarted","Data":"370631686ddc7aba98f4a6b4634378cfb8acd9271e8e45abb240c119128e8252"} Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.657246 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="79744cfd-ecdc-42c4-b70e-bb957640a11c" containerName="probe" containerID="cri-o://3d38ab3f39b8f10e80b68dcbf56b94dd2483224e667fea1a1a75ada7c0ecf901" gracePeriod=30 Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.657707 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.679593 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.679566964 podStartE2EDuration="5.679566964s" podCreationTimestamp="2026-01-29 17:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:57.669917672 +0000 UTC m=+2760.578636944" watchObservedRunningTime="2026-01-29 17:07:57.679566964 +0000 UTC m=+2760.588286236" Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.702446 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-846d49f49c-kc98b" podStartSLOduration=2.702429038 podStartE2EDuration="2.702429038s" podCreationTimestamp="2026-01-29 17:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:07:57.700867884 +0000 UTC m=+2760.609587156" watchObservedRunningTime="2026-01-29 17:07:57.702429038 +0000 UTC m=+2760.611148330" Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.725598 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55f7ff7dd6-jj4jw"] Jan 29 17:07:57 crc kubenswrapper[4886]: I0129 17:07:57.822192 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-55f7ff7dd6-jj4jw"] Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.347894 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-54f8bbfbf-9qjxm"] Jan 29 17:07:58 crc kubenswrapper[4886]: E0129 17:07:58.348588 4886 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea36feff-2438-49e4-b779-0b083addd0a8" containerName="barbican-api-log" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.348605 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea36feff-2438-49e4-b779-0b083addd0a8" containerName="barbican-api-log" Jan 29 17:07:58 crc kubenswrapper[4886]: E0129 17:07:58.348635 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea36feff-2438-49e4-b779-0b083addd0a8" containerName="barbican-api" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.348642 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea36feff-2438-49e4-b779-0b083addd0a8" containerName="barbican-api" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.350099 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea36feff-2438-49e4-b779-0b083addd0a8" containerName="barbican-api" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.350143 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea36feff-2438-49e4-b779-0b083addd0a8" containerName="barbican-api-log" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.351002 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.357300 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.357582 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.357748 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-658st" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.375066 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-54f8bbfbf-9qjxm"] Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.411770 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data-custom\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.411858 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn4rg\" (UniqueName: \"kubernetes.io/projected/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-kube-api-access-bn4rg\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.411877 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-combined-ca-bundle\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.411916 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: 
\"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.511782 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lbcqc"] Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.514105 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data-custom\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.514190 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn4rg\" (UniqueName: \"kubernetes.io/projected/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-kube-api-access-bn4rg\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.514208 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-combined-ca-bundle\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.514244 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.529221 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data-custom\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.548643 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.549024 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-combined-ca-bundle\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.554372 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-btn45"] Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.564157 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.574838 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn4rg\" (UniqueName: \"kubernetes.io/projected/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-kube-api-access-bn4rg\") pod \"heat-engine-54f8bbfbf-9qjxm\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.581661 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6f6c4bddd6-xqtdm"] Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.583112 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.596759 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.622855 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-557f889856-kwzsw"] Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.639282 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-btn45"] Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.641234 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.650146 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.698141 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea36feff-2438-49e4-b779-0b083addd0a8" path="/var/lib/kubelet/pods/ea36feff-2438-49e4-b779-0b083addd0a8/volumes" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.699211 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6f6c4bddd6-xqtdm"] Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.699241 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-658st" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.706458 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" podUID="77e77908-f078-4711-8c40-5e0bbda2a830" containerName="dnsmasq-dns" containerID="cri-o://53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8" gracePeriod=10 Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.707122 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733380 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733433 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6k7c\" (UniqueName: \"kubernetes.io/projected/da76d93d-7c2d-485e-b5e0-229f4254d74b-kube-api-access-m6k7c\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733457 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-config\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733519 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733567 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733581 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733606 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-combined-ca-bundle\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733824 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733897 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data-custom\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733915 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733934 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data-custom\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.733960 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhn24\" (UniqueName: \"kubernetes.io/projected/3fa8d357-cef3-43d1-8338-386d9880bb82-kube-api-access-xhn24\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.734005 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khr6q\" (UniqueName: \"kubernetes.io/projected/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-kube-api-access-khr6q\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.734037 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-combined-ca-bundle\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.780938 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-557f889856-kwzsw"] Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836038 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data-custom\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836075 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836104 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data-custom\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: 
\"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836130 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhn24\" (UniqueName: \"kubernetes.io/projected/3fa8d357-cef3-43d1-8338-386d9880bb82-kube-api-access-xhn24\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836167 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khr6q\" (UniqueName: \"kubernetes.io/projected/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-kube-api-access-khr6q\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836199 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-combined-ca-bundle\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836274 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836300 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6k7c\" (UniqueName: \"kubernetes.io/projected/da76d93d-7c2d-485e-b5e0-229f4254d74b-kube-api-access-m6k7c\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836338 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-config\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836468 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836521 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836536 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " 
pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836560 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-combined-ca-bundle\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.836629 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.837425 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.839400 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-config\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.843500 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.846198 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.847997 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.849187 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.851682 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-combined-ca-bundle\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.860302 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data-custom\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.861112 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data-custom\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.861265 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.869524 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-combined-ca-bundle\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.873465 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhn24\" (UniqueName: \"kubernetes.io/projected/3fa8d357-cef3-43d1-8338-386d9880bb82-kube-api-access-xhn24\") pod \"heat-api-557f889856-kwzsw\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.890541 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khr6q\" (UniqueName: \"kubernetes.io/projected/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-kube-api-access-khr6q\") pod \"heat-cfnapi-6f6c4bddd6-xqtdm\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.891513 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6k7c\" (UniqueName: \"kubernetes.io/projected/da76d93d-7c2d-485e-b5e0-229f4254d74b-kube-api-access-m6k7c\") pod \"dnsmasq-dns-7756b9d78c-btn45\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:58 crc kubenswrapper[4886]: I0129 17:07:58.996009 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.032410 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.100429 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.510496 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.583599 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-svc\") pod \"77e77908-f078-4711-8c40-5e0bbda2a830\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.583752 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rwhj\" (UniqueName: \"kubernetes.io/projected/77e77908-f078-4711-8c40-5e0bbda2a830-kube-api-access-6rwhj\") pod \"77e77908-f078-4711-8c40-5e0bbda2a830\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.583810 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-nb\") pod \"77e77908-f078-4711-8c40-5e0bbda2a830\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.583889 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-sb\") pod \"77e77908-f078-4711-8c40-5e0bbda2a830\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.583944 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-swift-storage-0\") pod \"77e77908-f078-4711-8c40-5e0bbda2a830\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.584053 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-config\") pod \"77e77908-f078-4711-8c40-5e0bbda2a830\" (UID: \"77e77908-f078-4711-8c40-5e0bbda2a830\") " Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.600639 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77e77908-f078-4711-8c40-5e0bbda2a830-kube-api-access-6rwhj" (OuterVolumeSpecName: "kube-api-access-6rwhj") pod "77e77908-f078-4711-8c40-5e0bbda2a830" (UID: "77e77908-f078-4711-8c40-5e0bbda2a830"). InnerVolumeSpecName "kube-api-access-6rwhj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.604763 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rwhj\" (UniqueName: \"kubernetes.io/projected/77e77908-f078-4711-8c40-5e0bbda2a830-kube-api-access-6rwhj\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.663518 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.663829 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.721093 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-config" (OuterVolumeSpecName: "config") pod "77e77908-f078-4711-8c40-5e0bbda2a830" (UID: "77e77908-f078-4711-8c40-5e0bbda2a830"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.733950 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "77e77908-f078-4711-8c40-5e0bbda2a830" (UID: "77e77908-f078-4711-8c40-5e0bbda2a830"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.750511 4886 generic.go:334] "Generic (PLEG): container finished" podID="79744cfd-ecdc-42c4-b70e-bb957640a11c" containerID="3d38ab3f39b8f10e80b68dcbf56b94dd2483224e667fea1a1a75ada7c0ecf901" exitCode=0 Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.750548 4886 generic.go:334] "Generic (PLEG): container finished" podID="79744cfd-ecdc-42c4-b70e-bb957640a11c" containerID="dd01b92d286ab63ee03bff172b9b03aa69d2a7db780bc4a7761f9cf8e7790134" exitCode=0 Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.750606 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"79744cfd-ecdc-42c4-b70e-bb957640a11c","Type":"ContainerDied","Data":"3d38ab3f39b8f10e80b68dcbf56b94dd2483224e667fea1a1a75ada7c0ecf901"} Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.750641 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"79744cfd-ecdc-42c4-b70e-bb957640a11c","Type":"ContainerDied","Data":"dd01b92d286ab63ee03bff172b9b03aa69d2a7db780bc4a7761f9cf8e7790134"} Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.760338 4886 generic.go:334] "Generic (PLEG): container finished" podID="77e77908-f078-4711-8c40-5e0bbda2a830" containerID="53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8" exitCode=0 Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.760397 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" event={"ID":"77e77908-f078-4711-8c40-5e0bbda2a830","Type":"ContainerDied","Data":"53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8"} Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.760428 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" event={"ID":"77e77908-f078-4711-8c40-5e0bbda2a830","Type":"ContainerDied","Data":"00c8741e78cdef06ac95516aebc006fef061abb10bc976627d894974f2fc0223"} Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.760449 4886 scope.go:117] "RemoveContainer" containerID="53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.760623 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lbcqc" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.762265 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "77e77908-f078-4711-8c40-5e0bbda2a830" (UID: "77e77908-f078-4711-8c40-5e0bbda2a830"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.763959 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "77e77908-f078-4711-8c40-5e0bbda2a830" (UID: "77e77908-f078-4711-8c40-5e0bbda2a830"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.772589 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77e77908-f078-4711-8c40-5e0bbda2a830" (UID: "77e77908-f078-4711-8c40-5e0bbda2a830"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.809141 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.809168 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.809178 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.809187 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.809195 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77e77908-f078-4711-8c40-5e0bbda2a830-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.832798 4886 scope.go:117] "RemoveContainer" containerID="c105784d4cb4a65b24766afa5c392562f921a5e8ba938bcdad19639f8052e82a" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.884804 4886 scope.go:117] "RemoveContainer" containerID="53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8" Jan 29 17:07:59 crc kubenswrapper[4886]: E0129 17:07:59.885257 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8\": container with ID starting with 53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8 not found: ID does not exist" containerID="53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.885289 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8"} err="failed to get container status \"53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8\": rpc error: code = NotFound desc = could not find container \"53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8\": container with ID starting with 53ca240c0a66f67f4b44ce143c7902f3cc1ddf7f2d59ac9c55d73990e13de5e8 not found: ID does not exist" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.885309 4886 scope.go:117] "RemoveContainer" containerID="c105784d4cb4a65b24766afa5c392562f921a5e8ba938bcdad19639f8052e82a" Jan 29 17:07:59 crc kubenswrapper[4886]: E0129 17:07:59.888862 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"c105784d4cb4a65b24766afa5c392562f921a5e8ba938bcdad19639f8052e82a\": container with ID starting with c105784d4cb4a65b24766afa5c392562f921a5e8ba938bcdad19639f8052e82a not found: ID does not exist" containerID="c105784d4cb4a65b24766afa5c392562f921a5e8ba938bcdad19639f8052e82a" Jan 29 17:07:59 crc kubenswrapper[4886]: I0129 17:07:59.888888 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c105784d4cb4a65b24766afa5c392562f921a5e8ba938bcdad19639f8052e82a"} err="failed to get container status \"c105784d4cb4a65b24766afa5c392562f921a5e8ba938bcdad19639f8052e82a\": rpc error: code = NotFound desc = could not find container \"c105784d4cb4a65b24766afa5c392562f921a5e8ba938bcdad19639f8052e82a\": container with ID starting with c105784d4cb4a65b24766afa5c392562f921a5e8ba938bcdad19639f8052e82a not found: ID does not exist" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.054623 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.065895 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-54f8bbfbf-9qjxm"] Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.209140 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lbcqc"] Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.224121 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79744cfd-ecdc-42c4-b70e-bb957640a11c-etc-machine-id\") pod \"79744cfd-ecdc-42c4-b70e-bb957640a11c\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.224272 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79744cfd-ecdc-42c4-b70e-bb957640a11c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "79744cfd-ecdc-42c4-b70e-bb957640a11c" (UID: "79744cfd-ecdc-42c4-b70e-bb957640a11c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.224385 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data-custom\") pod \"79744cfd-ecdc-42c4-b70e-bb957640a11c\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.224440 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-scripts\") pod \"79744cfd-ecdc-42c4-b70e-bb957640a11c\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.225266 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-combined-ca-bundle\") pod \"79744cfd-ecdc-42c4-b70e-bb957640a11c\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.225316 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data\") pod \"79744cfd-ecdc-42c4-b70e-bb957640a11c\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.225408 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrzlm\" (UniqueName: \"kubernetes.io/projected/79744cfd-ecdc-42c4-b70e-bb957640a11c-kube-api-access-zrzlm\") pod \"79744cfd-ecdc-42c4-b70e-bb957640a11c\" (UID: \"79744cfd-ecdc-42c4-b70e-bb957640a11c\") " Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.226139 4886 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79744cfd-ecdc-42c4-b70e-bb957640a11c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.236605 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79744cfd-ecdc-42c4-b70e-bb957640a11c-kube-api-access-zrzlm" (OuterVolumeSpecName: "kube-api-access-zrzlm") pod "79744cfd-ecdc-42c4-b70e-bb957640a11c" (UID: "79744cfd-ecdc-42c4-b70e-bb957640a11c"). InnerVolumeSpecName "kube-api-access-zrzlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.236720 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-scripts" (OuterVolumeSpecName: "scripts") pod "79744cfd-ecdc-42c4-b70e-bb957640a11c" (UID: "79744cfd-ecdc-42c4-b70e-bb957640a11c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.239962 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "79744cfd-ecdc-42c4-b70e-bb957640a11c" (UID: "79744cfd-ecdc-42c4-b70e-bb957640a11c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.257020 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lbcqc"] Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.327870 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.327901 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.327911 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrzlm\" (UniqueName: \"kubernetes.io/projected/79744cfd-ecdc-42c4-b70e-bb957640a11c-kube-api-access-zrzlm\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.342556 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79744cfd-ecdc-42c4-b70e-bb957640a11c" (UID: "79744cfd-ecdc-42c4-b70e-bb957640a11c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.429944 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.492469 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data" (OuterVolumeSpecName: "config-data") pod "79744cfd-ecdc-42c4-b70e-bb957640a11c" (UID: "79744cfd-ecdc-42c4-b70e-bb957640a11c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.531735 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79744cfd-ecdc-42c4-b70e-bb957640a11c-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.649910 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77e77908-f078-4711-8c40-5e0bbda2a830" path="/var/lib/kubelet/pods/77e77908-f078-4711-8c40-5e0bbda2a830/volumes" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.675307 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-557f889856-kwzsw"] Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.705502 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6f6c4bddd6-xqtdm"] Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.740409 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-btn45"] Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.783736 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.784396 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="proxy-httpd" containerID="cri-o://44a3542db94b31c96db714bd6c3559bd3e1d7d7a66d633f86abe33fb9a6f4bd0" gracePeriod=30 Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.784771 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="sg-core" containerID="cri-o://9d8e62602d1305f37f8a51b73f2c104ca86a67a3331fc3d826d42ccf0fac24ce" gracePeriod=30 Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.784821 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="ceilometer-notification-agent" containerID="cri-o://1bdf46565ca1048aaf33d2e55676cc44132df701332d9cac871024cf7e0601b1" gracePeriod=30 Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.784741 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="ceilometer-central-agent" containerID="cri-o://472df94bcf2c9160f704fb8f0e7681c07c27ea44d994460b0bfef6434e9a5bfa" gracePeriod=30 Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.812694 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.812717 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"79744cfd-ecdc-42c4-b70e-bb957640a11c","Type":"ContainerDied","Data":"eb5bacab0ef6b5257f3ba5127165c9496314e35a73af62c8e260a0b9866372e0"} Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.812809 4886 scope.go:117] "RemoveContainer" containerID="3d38ab3f39b8f10e80b68dcbf56b94dd2483224e667fea1a1a75ada7c0ecf901" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.821680 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" event={"ID":"da0e4cf4-a01f-48df-b61b-796c8bc9f60a","Type":"ContainerStarted","Data":"349855b0bf0483b72492372d5c1a6d697a135a4af893483f84d1a5f6df2c5a62"} Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.824359 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" event={"ID":"da76d93d-7c2d-485e-b5e0-229f4254d74b","Type":"ContainerStarted","Data":"bfc495e69c05d32911e1c19e2fff095c3d4fca06c566554a8f30f63272e3f284"} Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.826944 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54f8bbfbf-9qjxm" event={"ID":"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f","Type":"ContainerStarted","Data":"b974dc7a13dfe4723bbe5629a3fd12f5dbc56e7cab5fd25c13a1d891ca45ce3f"} Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.826972 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54f8bbfbf-9qjxm" event={"ID":"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f","Type":"ContainerStarted","Data":"0f319e6982b89bee08a0388a5eb4c63bb973328dc67504ccea174e9928171156"} Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.828202 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.840598 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-557f889856-kwzsw" event={"ID":"3fa8d357-cef3-43d1-8338-386d9880bb82","Type":"ContainerStarted","Data":"8e93f8d9b007e6405d2291aa2ff9660432275194b991846ebc2d8ccfab880ce5"} Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.871896 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.888619 4886 scope.go:117] "RemoveContainer" containerID="dd01b92d286ab63ee03bff172b9b03aa69d2a7db780bc4a7761f9cf8e7790134" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.918276 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 17:08:00 crc kubenswrapper[4886]: E0129 17:08:00.928781 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79744cfd_ecdc_42c4_b70e_bb957640a11c.slice/crio-eb5bacab0ef6b5257f3ba5127165c9496314e35a73af62c8e260a0b9866372e0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79744cfd_ecdc_42c4_b70e_bb957640a11c.slice\": RecentStats: unable to find data in memory cache]" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.935289 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 17:08:00 crc kubenswrapper[4886]: E0129 
17:08:00.935818 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e77908-f078-4711-8c40-5e0bbda2a830" containerName="dnsmasq-dns" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.935842 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e77908-f078-4711-8c40-5e0bbda2a830" containerName="dnsmasq-dns" Jan 29 17:08:00 crc kubenswrapper[4886]: E0129 17:08:00.935874 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79744cfd-ecdc-42c4-b70e-bb957640a11c" containerName="cinder-scheduler" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.935883 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="79744cfd-ecdc-42c4-b70e-bb957640a11c" containerName="cinder-scheduler" Jan 29 17:08:00 crc kubenswrapper[4886]: E0129 17:08:00.935899 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79744cfd-ecdc-42c4-b70e-bb957640a11c" containerName="probe" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.935907 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="79744cfd-ecdc-42c4-b70e-bb957640a11c" containerName="probe" Jan 29 17:08:00 crc kubenswrapper[4886]: E0129 17:08:00.935923 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e77908-f078-4711-8c40-5e0bbda2a830" containerName="init" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.935929 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e77908-f078-4711-8c40-5e0bbda2a830" containerName="init" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.936129 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="79744cfd-ecdc-42c4-b70e-bb957640a11c" containerName="probe" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.936149 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e77908-f078-4711-8c40-5e0bbda2a830" containerName="dnsmasq-dns" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.936159 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="79744cfd-ecdc-42c4-b70e-bb957640a11c" containerName="cinder-scheduler" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.937558 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.953406 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-54f8bbfbf-9qjxm" podStartSLOduration=2.953389363 podStartE2EDuration="2.953389363s" podCreationTimestamp="2026-01-29 17:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:00.86809209 +0000 UTC m=+2763.776811352" watchObservedRunningTime="2026-01-29 17:08:00.953389363 +0000 UTC m=+2763.862108635" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.962722 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 29 17:08:00 crc kubenswrapper[4886]: I0129 17:08:00.968422 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.057160 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-config-data\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.057525 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.057663 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.057735 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9b55479-5ea1-4a5b-9e34-e83313b04dec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.057813 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-scripts\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.058573 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2z69\" (UniqueName: \"kubernetes.io/projected/d9b55479-5ea1-4a5b-9e34-e83313b04dec-kube-api-access-s2z69\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.164133 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.164215 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.164247 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9b55479-5ea1-4a5b-9e34-e83313b04dec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.164275 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-scripts\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.164305 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2z69\" (UniqueName: \"kubernetes.io/projected/d9b55479-5ea1-4a5b-9e34-e83313b04dec-kube-api-access-s2z69\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.164450 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-config-data\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.165248 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9b55479-5ea1-4a5b-9e34-e83313b04dec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.171088 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.172631 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-config-data\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.173842 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.183848 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2z69\" 
(UniqueName: \"kubernetes.io/projected/d9b55479-5ea1-4a5b-9e34-e83313b04dec-kube-api-access-s2z69\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.193704 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9b55479-5ea1-4a5b-9e34-e83313b04dec-scripts\") pod \"cinder-scheduler-0\" (UID: \"d9b55479-5ea1-4a5b-9e34-e83313b04dec\") " pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.324514 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.810468 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.886618 4886 generic.go:334] "Generic (PLEG): container finished" podID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerID="44a3542db94b31c96db714bd6c3559bd3e1d7d7a66d633f86abe33fb9a6f4bd0" exitCode=0 Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.886651 4886 generic.go:334] "Generic (PLEG): container finished" podID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerID="9d8e62602d1305f37f8a51b73f2c104ca86a67a3331fc3d826d42ccf0fac24ce" exitCode=2 Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.886660 4886 generic.go:334] "Generic (PLEG): container finished" podID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerID="1bdf46565ca1048aaf33d2e55676cc44132df701332d9cac871024cf7e0601b1" exitCode=0 Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.886667 4886 generic.go:334] "Generic (PLEG): container finished" podID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerID="472df94bcf2c9160f704fb8f0e7681c07c27ea44d994460b0bfef6434e9a5bfa" exitCode=0 Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.886700 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e9fd03-4a7f-45c7-83e6-608ad7648766","Type":"ContainerDied","Data":"44a3542db94b31c96db714bd6c3559bd3e1d7d7a66d633f86abe33fb9a6f4bd0"} Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.886746 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e9fd03-4a7f-45c7-83e6-608ad7648766","Type":"ContainerDied","Data":"9d8e62602d1305f37f8a51b73f2c104ca86a67a3331fc3d826d42ccf0fac24ce"} Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.886758 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e9fd03-4a7f-45c7-83e6-608ad7648766","Type":"ContainerDied","Data":"1bdf46565ca1048aaf33d2e55676cc44132df701332d9cac871024cf7e0601b1"} Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.886768 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e9fd03-4a7f-45c7-83e6-608ad7648766","Type":"ContainerDied","Data":"472df94bcf2c9160f704fb8f0e7681c07c27ea44d994460b0bfef6434e9a5bfa"} Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.893271 4886 generic.go:334] "Generic (PLEG): container finished" podID="da76d93d-7c2d-485e-b5e0-229f4254d74b" containerID="aecb755c349be6f445700545d32b2d2a1cceeb8e44ce0b32e7f93655d8a60679" exitCode=0 Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.893389 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" 
event={"ID":"da76d93d-7c2d-485e-b5e0-229f4254d74b","Type":"ContainerDied","Data":"aecb755c349be6f445700545d32b2d2a1cceeb8e44ce0b32e7f93655d8a60679"} Jan 29 17:08:01 crc kubenswrapper[4886]: I0129 17:08:01.910766 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9b55479-5ea1-4a5b-9e34-e83313b04dec","Type":"ContainerStarted","Data":"90b34c7a69776956d3b5a18587107f777be1a70596c9cd0c0def826fbc244baa"} Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.161686 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-f458794ff-v7p92"] Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.179313 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-f458794ff-v7p92"] Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.179442 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.190995 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.191557 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.191914 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.227062 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc7kr\" (UniqueName: \"kubernetes.io/projected/79c81ef9-65c7-4372-9a47-8ed93521eadf-kube-api-access-sc7kr\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.227108 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/79c81ef9-65c7-4372-9a47-8ed93521eadf-etc-swift\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.227196 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-combined-ca-bundle\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.227223 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79c81ef9-65c7-4372-9a47-8ed93521eadf-log-httpd\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.227252 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-config-data\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.227313 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-internal-tls-certs\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.227355 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79c81ef9-65c7-4372-9a47-8ed93521eadf-run-httpd\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.227371 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-public-tls-certs\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.332268 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc7kr\" (UniqueName: \"kubernetes.io/projected/79c81ef9-65c7-4372-9a47-8ed93521eadf-kube-api-access-sc7kr\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.332610 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/79c81ef9-65c7-4372-9a47-8ed93521eadf-etc-swift\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.338996 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-combined-ca-bundle\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.339081 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79c81ef9-65c7-4372-9a47-8ed93521eadf-log-httpd\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.339146 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-config-data\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.339436 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-internal-tls-certs\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.339510 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79c81ef9-65c7-4372-9a47-8ed93521eadf-run-httpd\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.339546 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-public-tls-certs\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.339895 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79c81ef9-65c7-4372-9a47-8ed93521eadf-log-httpd\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.344093 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-combined-ca-bundle\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.344315 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79c81ef9-65c7-4372-9a47-8ed93521eadf-run-httpd\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.352277 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/79c81ef9-65c7-4372-9a47-8ed93521eadf-etc-swift\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.354464 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-public-tls-certs\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.354940 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-config-data\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.356929 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc7kr\" (UniqueName: \"kubernetes.io/projected/79c81ef9-65c7-4372-9a47-8ed93521eadf-kube-api-access-sc7kr\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.359303 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/79c81ef9-65c7-4372-9a47-8ed93521eadf-internal-tls-certs\") pod \"swift-proxy-f458794ff-v7p92\" (UID: \"79c81ef9-65c7-4372-9a47-8ed93521eadf\") " pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.368451 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.449763 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kkf6\" (UniqueName: \"kubernetes.io/projected/24e9fd03-4a7f-45c7-83e6-608ad7648766-kube-api-access-5kkf6\") pod \"24e9fd03-4a7f-45c7-83e6-608ad7648766\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.449833 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-run-httpd\") pod \"24e9fd03-4a7f-45c7-83e6-608ad7648766\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.450122 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-log-httpd\") pod \"24e9fd03-4a7f-45c7-83e6-608ad7648766\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.450300 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-sg-core-conf-yaml\") pod \"24e9fd03-4a7f-45c7-83e6-608ad7648766\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.450384 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-combined-ca-bundle\") pod \"24e9fd03-4a7f-45c7-83e6-608ad7648766\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.450420 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-config-data\") pod \"24e9fd03-4a7f-45c7-83e6-608ad7648766\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.450437 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-scripts\") pod \"24e9fd03-4a7f-45c7-83e6-608ad7648766\" (UID: \"24e9fd03-4a7f-45c7-83e6-608ad7648766\") " Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.454163 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "24e9fd03-4a7f-45c7-83e6-608ad7648766" (UID: "24e9fd03-4a7f-45c7-83e6-608ad7648766"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.455627 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "24e9fd03-4a7f-45c7-83e6-608ad7648766" (UID: "24e9fd03-4a7f-45c7-83e6-608ad7648766"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.460713 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24e9fd03-4a7f-45c7-83e6-608ad7648766-kube-api-access-5kkf6" (OuterVolumeSpecName: "kube-api-access-5kkf6") pod "24e9fd03-4a7f-45c7-83e6-608ad7648766" (UID: "24e9fd03-4a7f-45c7-83e6-608ad7648766"). InnerVolumeSpecName "kube-api-access-5kkf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.484696 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-scripts" (OuterVolumeSpecName: "scripts") pod "24e9fd03-4a7f-45c7-83e6-608ad7648766" (UID: "24e9fd03-4a7f-45c7-83e6-608ad7648766"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.533881 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.586477 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.586524 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kkf6\" (UniqueName: \"kubernetes.io/projected/24e9fd03-4a7f-45c7-83e6-608ad7648766-kube-api-access-5kkf6\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.586535 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.586543 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e9fd03-4a7f-45c7-83e6-608ad7648766-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.658499 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "24e9fd03-4a7f-45c7-83e6-608ad7648766" (UID: "24e9fd03-4a7f-45c7-83e6-608ad7648766"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.691487 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79744cfd-ecdc-42c4-b70e-bb957640a11c" path="/var/lib/kubelet/pods/79744cfd-ecdc-42c4-b70e-bb957640a11c/volumes" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.692377 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.780610 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24e9fd03-4a7f-45c7-83e6-608ad7648766" (UID: "24e9fd03-4a7f-45c7-83e6-608ad7648766"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.793837 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.838138 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-config-data" (OuterVolumeSpecName: "config-data") pod "24e9fd03-4a7f-45c7-83e6-608ad7648766" (UID: "24e9fd03-4a7f-45c7-83e6-608ad7648766"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.895429 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e9fd03-4a7f-45c7-83e6-608ad7648766-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.950784 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" event={"ID":"da76d93d-7c2d-485e-b5e0-229f4254d74b","Type":"ContainerStarted","Data":"d9ab37d44f372064ee89522913b27477d9c2a6f3f0efeec33809e585d943fe38"} Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.951151 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.965947 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.966793 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e9fd03-4a7f-45c7-83e6-608ad7648766","Type":"ContainerDied","Data":"92751cfdf549c65a3a37a865694b9ce91879a5f41c663c775080337b3acc7481"} Jan 29 17:08:02 crc kubenswrapper[4886]: I0129 17:08:02.966846 4886 scope.go:117] "RemoveContainer" containerID="44a3542db94b31c96db714bd6c3559bd3e1d7d7a66d633f86abe33fb9a6f4bd0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.010092 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" podStartSLOduration=5.01007263 podStartE2EDuration="5.01007263s" podCreationTimestamp="2026-01-29 17:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:02.979228672 +0000 UTC m=+2765.887947934" watchObservedRunningTime="2026-01-29 17:08:03.01007263 +0000 UTC m=+2765.918791902" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.011744 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.032029 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.076568 4886 scope.go:117] "RemoveContainer" containerID="9d8e62602d1305f37f8a51b73f2c104ca86a67a3331fc3d826d42ccf0fac24ce" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.094417 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:03 crc kubenswrapper[4886]: E0129 17:08:03.104728 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="proxy-httpd" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.104775 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="proxy-httpd" Jan 29 17:08:03 crc kubenswrapper[4886]: E0129 17:08:03.104832 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="ceilometer-notification-agent" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.104841 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="ceilometer-notification-agent" Jan 29 17:08:03 crc kubenswrapper[4886]: E0129 17:08:03.104880 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="ceilometer-central-agent" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.104887 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="ceilometer-central-agent" Jan 29 17:08:03 crc kubenswrapper[4886]: E0129 17:08:03.104912 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="sg-core" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.104919 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="sg-core" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.105576 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="sg-core" Jan 29 17:08:03 crc 
kubenswrapper[4886]: I0129 17:08:03.105616 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="ceilometer-central-agent" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.105643 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="ceilometer-notification-agent" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.105653 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" containerName="proxy-httpd" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.110319 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.113783 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.114171 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.130753 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.167231 4886 scope.go:117] "RemoveContainer" containerID="1bdf46565ca1048aaf33d2e55676cc44132df701332d9cac871024cf7e0601b1" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.202962 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt8hq\" (UniqueName: \"kubernetes.io/projected/e0ea79fe-a2e5-4861-be91-aba220b1b221-kube-api-access-rt8hq\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.203060 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-run-httpd\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.203189 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-scripts\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.203270 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.203378 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.203439 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-log-httpd\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.203498 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-config-data\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.262917 4886 scope.go:117] "RemoveContainer" containerID="472df94bcf2c9160f704fb8f0e7681c07c27ea44d994460b0bfef6434e9a5bfa" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.309682 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-run-httpd\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.309762 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-scripts\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.309805 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.309856 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.309876 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-log-httpd\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.309897 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-config-data\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.309977 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt8hq\" (UniqueName: \"kubernetes.io/projected/e0ea79fe-a2e5-4861-be91-aba220b1b221-kube-api-access-rt8hq\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.310777 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-run-httpd\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 
crc kubenswrapper[4886]: I0129 17:08:03.316004 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.325573 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-scripts\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.325808 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-log-httpd\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.335128 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt8hq\" (UniqueName: \"kubernetes.io/projected/e0ea79fe-a2e5-4861-be91-aba220b1b221-kube-api-access-rt8hq\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.335964 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.357686 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-config-data\") pod \"ceilometer-0\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.359855 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-f458794ff-v7p92"] Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.496813 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:03 crc kubenswrapper[4886]: I0129 17:08:03.992054 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9b55479-5ea1-4a5b-9e34-e83313b04dec","Type":"ContainerStarted","Data":"80305faab9c62eace7d4c1bdb3bb280453207a39ecf367613ce2d312e44454f2"} Jan 29 17:08:04 crc kubenswrapper[4886]: I0129 17:08:04.011337 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f458794ff-v7p92" event={"ID":"79c81ef9-65c7-4372-9a47-8ed93521eadf","Type":"ContainerStarted","Data":"5dbd6462c80bc5cade9d736da39f17d5f27d4a0e06bee0ed49ba5fb78b9bb1e7"} Jan 29 17:08:04 crc kubenswrapper[4886]: I0129 17:08:04.286827 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:04 crc kubenswrapper[4886]: I0129 17:08:04.644850 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24e9fd03-4a7f-45c7-83e6-608ad7648766" path="/var/lib/kubelet/pods/24e9fd03-4a7f-45c7-83e6-608ad7648766/volumes" Jan 29 17:08:05 crc kubenswrapper[4886]: I0129 17:08:05.034304 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f458794ff-v7p92" event={"ID":"79c81ef9-65c7-4372-9a47-8ed93521eadf","Type":"ContainerStarted","Data":"b04a5dbfb771cedc564c98fd3551b8ad5346c3b7c7de45d6fa5e9ae368e761db"} Jan 29 17:08:05 crc kubenswrapper[4886]: I0129 17:08:05.049442 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9b55479-5ea1-4a5b-9e34-e83313b04dec","Type":"ContainerStarted","Data":"4d77970ac02df85f6db6ea041b1b14f3281f397dd1d73b477c3ccbbd864b1c13"} Jan 29 17:08:05 crc kubenswrapper[4886]: I0129 17:08:05.089457 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.089436317 podStartE2EDuration="5.089436317s" podCreationTimestamp="2026-01-29 17:08:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:05.075708741 +0000 UTC m=+2767.984428023" watchObservedRunningTime="2026-01-29 17:08:05.089436317 +0000 UTC m=+2767.998155579" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.169994 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.325546 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 29 17:08:06 crc kubenswrapper[4886]: W0129 17:08:06.576172 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0ea79fe_a2e5_4861_be91_aba220b1b221.slice/crio-928834e62ea2e840bea0af8f378a7be863b8582e831ecb530090b696cd7380b1 WatchSource:0}: Error finding container 928834e62ea2e840bea0af8f378a7be863b8582e831ecb530090b696cd7380b1: Status 404 returned error can't find the container with id 928834e62ea2e840bea0af8f378a7be863b8582e831ecb530090b696cd7380b1 Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.767569 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5f6fd667fd-4s5hk"] Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.769987 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.799027 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-54985c87ff-g5725"] Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.801163 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.828209 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5f6fd667fd-4s5hk"] Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.845403 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqs6\" (UniqueName: \"kubernetes.io/projected/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-kube-api-access-8wqs6\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.845499 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-config-data\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.845582 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-combined-ca-bundle\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.845629 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-config-data-custom\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.865385 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-54985c87ff-g5725"] Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.885381 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6c7bddd46c-bnlxj"] Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.887095 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.934903 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6c7bddd46c-bnlxj"] Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.952496 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.954569 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.954600 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data-custom\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.954649 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-combined-ca-bundle\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.954731 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb8rb\" (UniqueName: \"kubernetes.io/projected/04a4a757-71c6-46ec-9019-8d2f64be8285-kube-api-access-bb8rb\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.954757 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-config-data-custom\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.954942 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqs6\" (UniqueName: \"kubernetes.io/projected/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-kube-api-access-8wqs6\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.955011 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-combined-ca-bundle\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.955065 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data-custom\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.955143 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6cjb\" (UniqueName: \"kubernetes.io/projected/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-kube-api-access-p6cjb\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.955172 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-config-data\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.955205 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-combined-ca-bundle\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.978985 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-config-data-custom\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.979905 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-combined-ca-bundle\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:06 crc kubenswrapper[4886]: I0129 17:08:06.981109 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-config-data\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.033801 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqs6\" (UniqueName: \"kubernetes.io/projected/3b8fde91-2520-41c6-bc79-1f6b186dcbf0-kube-api-access-8wqs6\") pod \"heat-engine-5f6fd667fd-4s5hk\" (UID: \"3b8fde91-2520-41c6-bc79-1f6b186dcbf0\") " pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.060469 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data-custom\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.060673 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bb8rb\" (UniqueName: \"kubernetes.io/projected/04a4a757-71c6-46ec-9019-8d2f64be8285-kube-api-access-bb8rb\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.066095 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-combined-ca-bundle\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.066187 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data-custom\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.066391 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6cjb\" (UniqueName: \"kubernetes.io/projected/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-kube-api-access-p6cjb\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.066478 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-combined-ca-bundle\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.066615 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.066678 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.066986 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data-custom\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.085024 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data-custom\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.085919 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.086690 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-combined-ca-bundle\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.092378 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-combined-ca-bundle\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.093054 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.094407 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0ea79fe-a2e5-4861-be91-aba220b1b221","Type":"ContainerStarted","Data":"928834e62ea2e840bea0af8f378a7be863b8582e831ecb530090b696cd7380b1"} Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.097606 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb8rb\" (UniqueName: \"kubernetes.io/projected/04a4a757-71c6-46ec-9019-8d2f64be8285-kube-api-access-bb8rb\") pod \"heat-cfnapi-54985c87ff-g5725\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.105152 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6cjb\" (UniqueName: \"kubernetes.io/projected/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-kube-api-access-p6cjb\") pod \"heat-api-6c7bddd46c-bnlxj\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.147677 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.224017 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:07 crc kubenswrapper[4886]: I0129 17:08:07.235148 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:07.738841 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5f6fd667fd-4s5hk"] Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.146092 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" event={"ID":"da0e4cf4-a01f-48df-b61b-796c8bc9f60a","Type":"ContainerStarted","Data":"43336df2fcaf1b7acdf86423e30be9a3f4bd5a0f8198c273d550486720809b18"} Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.146455 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.171263 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f458794ff-v7p92" event={"ID":"79c81ef9-65c7-4372-9a47-8ed93521eadf","Type":"ContainerStarted","Data":"d13099f58927242dabf2518b9f0c1ef06941bb2bf99961324b02014accac3771"} Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.172554 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.172584 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.175237 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0ea79fe-a2e5-4861-be91-aba220b1b221","Type":"ContainerStarted","Data":"5d0ddc2798e73cd33929ee945c72ef848dc6759a75fd9fcc95c2f939f265b877"} Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.196251 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5f6fd667fd-4s5hk" event={"ID":"3b8fde91-2520-41c6-bc79-1f6b186dcbf0","Type":"ContainerStarted","Data":"20d13db972e656bc190d452afe9dd4ec56d5a39d7d01657e5c9f210465635685"} Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.196285 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5f6fd667fd-4s5hk" event={"ID":"3b8fde91-2520-41c6-bc79-1f6b186dcbf0","Type":"ContainerStarted","Data":"e182f1bb7108c6a8e580c33036a302e948ed4477844a9f9bc581fc486d65f70b"} Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.197000 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.218296 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" podStartSLOduration=4.215341836 podStartE2EDuration="10.218258022s" podCreationTimestamp="2026-01-29 17:07:58 +0000 UTC" firstStartedPulling="2026-01-29 17:08:00.705976474 +0000 UTC m=+2763.614695746" lastFinishedPulling="2026-01-29 17:08:06.70889266 +0000 UTC m=+2769.617611932" observedRunningTime="2026-01-29 17:08:08.174255773 +0000 UTC m=+2771.082975065" watchObservedRunningTime="2026-01-29 17:08:08.218258022 +0000 UTC m=+2771.126977284" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.218901 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-557f889856-kwzsw" event={"ID":"3fa8d357-cef3-43d1-8338-386d9880bb82","Type":"ContainerStarted","Data":"69b0f3248bd2be75d1851a0e7878c496c05c0ca2dacd1bbce93fad67d36c48ff"} Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.220011 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.240450 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-f458794ff-v7p92" podStartSLOduration=6.240420366 podStartE2EDuration="6.240420366s" podCreationTimestamp="2026-01-29 17:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:08.210219046 +0000 UTC m=+2771.118938318" watchObservedRunningTime="2026-01-29 17:08:08.240420366 +0000 UTC m=+2771.149139638" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.262298 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5f6fd667fd-4s5hk" podStartSLOduration=2.262270732 podStartE2EDuration="2.262270732s" podCreationTimestamp="2026-01-29 17:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:08.225232589 +0000 UTC m=+2771.133951881" watchObservedRunningTime="2026-01-29 17:08:08.262270732 +0000 UTC m=+2771.170990004" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.294000 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-557f889856-kwzsw" podStartSLOduration=4.273907037 podStartE2EDuration="10.293976885s" podCreationTimestamp="2026-01-29 17:07:58 +0000 UTC" firstStartedPulling="2026-01-29 17:08:00.687602307 +0000 UTC m=+2763.596321579" lastFinishedPulling="2026-01-29 17:08:06.707672155 +0000 UTC m=+2769.616391427" observedRunningTime="2026-01-29 17:08:08.241855407 +0000 UTC m=+2771.150574689" watchObservedRunningTime="2026-01-29 17:08:08.293976885 +0000 UTC m=+2771.202696157" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:08.998623 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.075712 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-96hn8"] Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.075970 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" podUID="80d171a6-11ab-4cdf-b469-acb56ff11735" containerName="dnsmasq-dns" containerID="cri-o://705da8d91cb45e05b6aa5ab5b116ce8252bf3f498078113a7eee5edc1d206bca" gracePeriod=10 Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.160903 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6f6c4bddd6-xqtdm"] Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.190557 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7c65449fdf-42rxg"] Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.209117 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.213579 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7c65449fdf-42rxg"] Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.217294 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.217488 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.284723 4886 generic.go:334] "Generic (PLEG): container finished" podID="80d171a6-11ab-4cdf-b469-acb56ff11735" containerID="705da8d91cb45e05b6aa5ab5b116ce8252bf3f498078113a7eee5edc1d206bca" exitCode=0 Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.285835 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" event={"ID":"80d171a6-11ab-4cdf-b469-acb56ff11735","Type":"ContainerDied","Data":"705da8d91cb45e05b6aa5ab5b116ce8252bf3f498078113a7eee5edc1d206bca"} Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.338695 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-public-tls-certs\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.338816 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-config-data\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.338880 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g88x6\" (UniqueName: \"kubernetes.io/projected/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-kube-api-access-g88x6\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.338906 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-internal-tls-certs\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.338935 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-combined-ca-bundle\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.338967 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-config-data-custom\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: 
\"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.398535 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-557f889856-kwzsw"] Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.410676 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-64bb5bfdfc-h2mgd"] Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.413216 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.423088 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-64bb5bfdfc-h2mgd"] Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.423478 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.423663 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.472417 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-f458794ff-v7p92" podUID="79c81ef9-65c7-4372-9a47-8ed93521eadf" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.492677 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-config-data-custom\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.492731 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjx4h\" (UniqueName: \"kubernetes.io/projected/a004f05d-8133-4d8e-9e3c-d5c9411351ad-kube-api-access-vjx4h\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.492829 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-public-tls-certs\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.492989 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-internal-tls-certs\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.493049 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-combined-ca-bundle\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.493131 4886 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-config-data\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.493176 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-public-tls-certs\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.493238 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-config-data\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.493391 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g88x6\" (UniqueName: \"kubernetes.io/projected/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-kube-api-access-g88x6\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.493433 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-internal-tls-certs\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.493478 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-combined-ca-bundle\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.493532 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-config-data-custom\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.522807 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-config-data-custom\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.524023 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-internal-tls-certs\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.524788 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-combined-ca-bundle\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.536069 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-public-tls-certs\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.551457 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g88x6\" (UniqueName: \"kubernetes.io/projected/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-kube-api-access-g88x6\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.560855 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2-config-data\") pod \"heat-cfnapi-7c65449fdf-42rxg\" (UID: \"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2\") " pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.567122 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.613900 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-config-data-custom\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.613969 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjx4h\" (UniqueName: \"kubernetes.io/projected/a004f05d-8133-4d8e-9e3c-d5c9411351ad-kube-api-access-vjx4h\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.614049 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-internal-tls-certs\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.614082 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-combined-ca-bundle\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.614110 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-public-tls-certs\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc 
kubenswrapper[4886]: I0129 17:08:09.614140 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-config-data\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.626231 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-internal-tls-certs\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.627179 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-config-data\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.627826 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-config-data-custom\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.640313 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-public-tls-certs\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.641285 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a004f05d-8133-4d8e-9e3c-d5c9411351ad-combined-ca-bundle\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.695250 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjx4h\" (UniqueName: \"kubernetes.io/projected/a004f05d-8133-4d8e-9e3c-d5c9411351ad-kube-api-access-vjx4h\") pod \"heat-api-64bb5bfdfc-h2mgd\" (UID: \"a004f05d-8133-4d8e-9e3c-d5c9411351ad\") " pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.702687 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-54985c87ff-g5725"] Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.745228 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6c7bddd46c-bnlxj"] Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.776775 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:09 crc kubenswrapper[4886]: I0129 17:08:09.869179 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" podUID="80d171a6-11ab-4cdf-b469-acb56ff11735" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.205:5353: connect: connection refused" Jan 29 17:08:10 crc kubenswrapper[4886]: I0129 17:08:10.180169 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:10 crc kubenswrapper[4886]: I0129 17:08:10.337485 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" podUID="da0e4cf4-a01f-48df-b61b-796c8bc9f60a" containerName="heat-cfnapi" containerID="cri-o://43336df2fcaf1b7acdf86423e30be9a3f4bd5a0f8198c273d550486720809b18" gracePeriod=60 Jan 29 17:08:10 crc kubenswrapper[4886]: I0129 17:08:10.812485 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:11 crc kubenswrapper[4886]: I0129 17:08:11.351427 4886 generic.go:334] "Generic (PLEG): container finished" podID="da0e4cf4-a01f-48df-b61b-796c8bc9f60a" containerID="43336df2fcaf1b7acdf86423e30be9a3f4bd5a0f8198c273d550486720809b18" exitCode=0 Jan 29 17:08:11 crc kubenswrapper[4886]: I0129 17:08:11.351528 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" event={"ID":"da0e4cf4-a01f-48df-b61b-796c8bc9f60a","Type":"ContainerDied","Data":"43336df2fcaf1b7acdf86423e30be9a3f4bd5a0f8198c273d550486720809b18"} Jan 29 17:08:11 crc kubenswrapper[4886]: I0129 17:08:11.351922 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-557f889856-kwzsw" podUID="3fa8d357-cef3-43d1-8338-386d9880bb82" containerName="heat-api" containerID="cri-o://69b0f3248bd2be75d1851a0e7878c496c05c0ca2dacd1bbce93fad67d36c48ff" gracePeriod=60 Jan 29 17:08:11 crc kubenswrapper[4886]: I0129 17:08:11.600859 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.200742 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vflxs"] Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.204844 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.219358 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vflxs"] Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.292203 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-utilities\") pod \"certified-operators-vflxs\" (UID: \"18c5f721-30d1-48de-97e4-52399587c9d1\") " pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.292388 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tzn7\" (UniqueName: \"kubernetes.io/projected/18c5f721-30d1-48de-97e4-52399587c9d1-kube-api-access-2tzn7\") pod \"certified-operators-vflxs\" (UID: \"18c5f721-30d1-48de-97e4-52399587c9d1\") " pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.292437 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-catalog-content\") pod \"certified-operators-vflxs\" (UID: \"18c5f721-30d1-48de-97e4-52399587c9d1\") " pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.368712 4886 generic.go:334] "Generic (PLEG): container finished" podID="3fa8d357-cef3-43d1-8338-386d9880bb82" containerID="69b0f3248bd2be75d1851a0e7878c496c05c0ca2dacd1bbce93fad67d36c48ff" exitCode=0 Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.368771 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-557f889856-kwzsw" event={"ID":"3fa8d357-cef3-43d1-8338-386d9880bb82","Type":"ContainerDied","Data":"69b0f3248bd2be75d1851a0e7878c496c05c0ca2dacd1bbce93fad67d36c48ff"} Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.394384 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tzn7\" (UniqueName: \"kubernetes.io/projected/18c5f721-30d1-48de-97e4-52399587c9d1-kube-api-access-2tzn7\") pod \"certified-operators-vflxs\" (UID: \"18c5f721-30d1-48de-97e4-52399587c9d1\") " pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.394469 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-catalog-content\") pod \"certified-operators-vflxs\" (UID: \"18c5f721-30d1-48de-97e4-52399587c9d1\") " pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.394616 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-utilities\") pod \"certified-operators-vflxs\" (UID: \"18c5f721-30d1-48de-97e4-52399587c9d1\") " pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.395192 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-utilities\") pod \"certified-operators-vflxs\" (UID: 
\"18c5f721-30d1-48de-97e4-52399587c9d1\") " pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.395252 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-catalog-content\") pod \"certified-operators-vflxs\" (UID: \"18c5f721-30d1-48de-97e4-52399587c9d1\") " pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.418394 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tzn7\" (UniqueName: \"kubernetes.io/projected/18c5f721-30d1-48de-97e4-52399587c9d1-kube-api-access-2tzn7\") pod \"certified-operators-vflxs\" (UID: \"18c5f721-30d1-48de-97e4-52399587c9d1\") " pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.540585 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:12 crc kubenswrapper[4886]: I0129 17:08:12.553926 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-f458794ff-v7p92" Jan 29 17:08:14 crc kubenswrapper[4886]: I0129 17:08:14.034523 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" podUID="da0e4cf4-a01f-48df-b61b-796c8bc9f60a" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.228:8000/healthcheck\": dial tcp 10.217.0.228:8000: connect: connection refused" Jan 29 17:08:14 crc kubenswrapper[4886]: I0129 17:08:14.103284 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-557f889856-kwzsw" podUID="3fa8d357-cef3-43d1-8338-386d9880bb82" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.229:8004/healthcheck\": dial tcp 10.217.0.229:8004: connect: connection refused" Jan 29 17:08:14 crc kubenswrapper[4886]: I0129 17:08:14.869422 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" podUID="80d171a6-11ab-4cdf-b469-acb56ff11735" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.205:5353: connect: connection refused" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.194775 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.298476 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.330814 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data\") pod \"3fa8d357-cef3-43d1-8338-386d9880bb82\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.330870 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-combined-ca-bundle\") pod \"3fa8d357-cef3-43d1-8338-386d9880bb82\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.330902 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhn24\" (UniqueName: \"kubernetes.io/projected/3fa8d357-cef3-43d1-8338-386d9880bb82-kube-api-access-xhn24\") pod \"3fa8d357-cef3-43d1-8338-386d9880bb82\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.331042 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data-custom\") pod \"3fa8d357-cef3-43d1-8338-386d9880bb82\" (UID: \"3fa8d357-cef3-43d1-8338-386d9880bb82\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.345766 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3fa8d357-cef3-43d1-8338-386d9880bb82" (UID: "3fa8d357-cef3-43d1-8338-386d9880bb82"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.350478 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa8d357-cef3-43d1-8338-386d9880bb82-kube-api-access-xhn24" (OuterVolumeSpecName: "kube-api-access-xhn24") pod "3fa8d357-cef3-43d1-8338-386d9880bb82" (UID: "3fa8d357-cef3-43d1-8338-386d9880bb82"). InnerVolumeSpecName "kube-api-access-xhn24". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.381888 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.386023 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fa8d357-cef3-43d1-8338-386d9880bb82" (UID: "3fa8d357-cef3-43d1-8338-386d9880bb82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.419658 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data" (OuterVolumeSpecName: "config-data") pod "3fa8d357-cef3-43d1-8338-386d9880bb82" (UID: "3fa8d357-cef3-43d1-8338-386d9880bb82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.433206 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data-custom\") pod \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.433381 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-nb\") pod \"80d171a6-11ab-4cdf-b469-acb56ff11735\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.433481 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-svc\") pod \"80d171a6-11ab-4cdf-b469-acb56ff11735\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.433611 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-combined-ca-bundle\") pod \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.433638 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-sb\") pod \"80d171a6-11ab-4cdf-b469-acb56ff11735\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.433857 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data\") pod \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.434188 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8bwl\" (UniqueName: \"kubernetes.io/projected/80d171a6-11ab-4cdf-b469-acb56ff11735-kube-api-access-t8bwl\") pod \"80d171a6-11ab-4cdf-b469-acb56ff11735\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.434235 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khr6q\" (UniqueName: \"kubernetes.io/projected/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-kube-api-access-khr6q\") pod \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\" (UID: \"da0e4cf4-a01f-48df-b61b-796c8bc9f60a\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.434297 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-swift-storage-0\") pod \"80d171a6-11ab-4cdf-b469-acb56ff11735\" (UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.434398 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-config\") pod \"80d171a6-11ab-4cdf-b469-acb56ff11735\" 
(UID: \"80d171a6-11ab-4cdf-b469-acb56ff11735\") " Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.435552 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.435571 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.435587 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhn24\" (UniqueName: \"kubernetes.io/projected/3fa8d357-cef3-43d1-8338-386d9880bb82-kube-api-access-xhn24\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.435624 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa8d357-cef3-43d1-8338-386d9880bb82-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.436976 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da0e4cf4-a01f-48df-b61b-796c8bc9f60a" (UID: "da0e4cf4-a01f-48df-b61b-796c8bc9f60a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.445589 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-kube-api-access-khr6q" (OuterVolumeSpecName: "kube-api-access-khr6q") pod "da0e4cf4-a01f-48df-b61b-796c8bc9f60a" (UID: "da0e4cf4-a01f-48df-b61b-796c8bc9f60a"). InnerVolumeSpecName "kube-api-access-khr6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.447647 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80d171a6-11ab-4cdf-b469-acb56ff11735-kube-api-access-t8bwl" (OuterVolumeSpecName: "kube-api-access-t8bwl") pod "80d171a6-11ab-4cdf-b469-acb56ff11735" (UID: "80d171a6-11ab-4cdf-b469-acb56ff11735"). InnerVolumeSpecName "kube-api-access-t8bwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.460433 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.460447 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f6c4bddd6-xqtdm" event={"ID":"da0e4cf4-a01f-48df-b61b-796c8bc9f60a","Type":"ContainerDied","Data":"349855b0bf0483b72492372d5c1a6d697a135a4af893483f84d1a5f6df2c5a62"} Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.460502 4886 scope.go:117] "RemoveContainer" containerID="43336df2fcaf1b7acdf86423e30be9a3f4bd5a0f8198c273d550486720809b18" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.474314 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c7bddd46c-bnlxj" event={"ID":"7b6ce536-47ec-45b9-b926-28f1fa7eb80a","Type":"ContainerStarted","Data":"961b09e7b27b7da7b2c511e013f3ab233e3894f45363e6e86d452b156483c7e5"} Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.474372 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c7bddd46c-bnlxj" event={"ID":"7b6ce536-47ec-45b9-b926-28f1fa7eb80a","Type":"ContainerStarted","Data":"28c29d3f5a45d8f6e82cfdb663ace90ab610bc4d1d57239fe93c946573d05d45"} Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.475996 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.479341 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da0e4cf4-a01f-48df-b61b-796c8bc9f60a" (UID: "da0e4cf4-a01f-48df-b61b-796c8bc9f60a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.480765 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54985c87ff-g5725" event={"ID":"04a4a757-71c6-46ec-9019-8d2f64be8285","Type":"ContainerStarted","Data":"d090a953dc19f1ee4b0424500aecfa717e2c4abdf9af4db4264c3428dc2d84f8"} Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.480800 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54985c87ff-g5725" event={"ID":"04a4a757-71c6-46ec-9019-8d2f64be8285","Type":"ContainerStarted","Data":"7f461b34367fc19b6002113f40bc4d964e2fb98d4e2fb8a58fd1680309b095e9"} Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.481815 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.507863 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"be43aab6-3888-4260-a85c-147e2ae0a36d","Type":"ContainerStarted","Data":"a238adb9e047d62411d78f0b37ed4276b323e2049accd30dfa5c15023aeaa6e5"} Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.524623 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0ea79fe-a2e5-4861-be91-aba220b1b221","Type":"ContainerStarted","Data":"463c890cb672987e4db62f57b14305282dced80284ec2842a2e3a25befe23bf9"} Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.527087 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-557f889856-kwzsw" event={"ID":"3fa8d357-cef3-43d1-8338-386d9880bb82","Type":"ContainerDied","Data":"8e93f8d9b007e6405d2291aa2ff9660432275194b991846ebc2d8ccfab880ce5"} Jan 29 17:08:17 crc 
kubenswrapper[4886]: I0129 17:08:17.527143 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-557f889856-kwzsw" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.527153 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "80d171a6-11ab-4cdf-b469-acb56ff11735" (UID: "80d171a6-11ab-4cdf-b469-acb56ff11735"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.527729 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "80d171a6-11ab-4cdf-b469-acb56ff11735" (UID: "80d171a6-11ab-4cdf-b469-acb56ff11735"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.552161 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8bwl\" (UniqueName: \"kubernetes.io/projected/80d171a6-11ab-4cdf-b469-acb56ff11735-kube-api-access-t8bwl\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.552188 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khr6q\" (UniqueName: \"kubernetes.io/projected/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-kube-api-access-khr6q\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.552198 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.552207 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.552218 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.552227 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.555190 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" event={"ID":"80d171a6-11ab-4cdf-b469-acb56ff11735","Type":"ContainerDied","Data":"81bf0e642c0dbb7fd724006f0c2c518606f7b43d2584453df92bcfe55b829357"} Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.555297 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-96hn8" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.563197 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6c7bddd46c-bnlxj" podStartSLOduration=11.563180157 podStartE2EDuration="11.563180157s" podCreationTimestamp="2026-01-29 17:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:17.508396734 +0000 UTC m=+2780.417116006" watchObservedRunningTime="2026-01-29 17:08:17.563180157 +0000 UTC m=+2780.471899429" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.576892 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-54985c87ff-g5725" podStartSLOduration=11.576875763 podStartE2EDuration="11.576875763s" podCreationTimestamp="2026-01-29 17:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:17.5586728 +0000 UTC m=+2780.467392072" watchObservedRunningTime="2026-01-29 17:08:17.576875763 +0000 UTC m=+2780.485595035" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.579547 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data" (OuterVolumeSpecName: "config-data") pod "da0e4cf4-a01f-48df-b61b-796c8bc9f60a" (UID: "da0e4cf4-a01f-48df-b61b-796c8bc9f60a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.583874 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "80d171a6-11ab-4cdf-b469-acb56ff11735" (UID: "80d171a6-11ab-4cdf-b469-acb56ff11735"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.618838 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "80d171a6-11ab-4cdf-b469-acb56ff11735" (UID: "80d171a6-11ab-4cdf-b469-acb56ff11735"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.629236 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.702422509 podStartE2EDuration="27.629215137s" podCreationTimestamp="2026-01-29 17:07:50 +0000 UTC" firstStartedPulling="2026-01-29 17:07:51.725421842 +0000 UTC m=+2754.634141114" lastFinishedPulling="2026-01-29 17:08:16.65221448 +0000 UTC m=+2779.560933742" observedRunningTime="2026-01-29 17:08:17.58778299 +0000 UTC m=+2780.496502262" watchObservedRunningTime="2026-01-29 17:08:17.629215137 +0000 UTC m=+2780.537934409" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.644241 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.644549 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="849de0d3-3456-44c2-bef4-3a435e4a432a" containerName="glance-log" containerID="cri-o://685691dd71892e3462a49d43e961e4398610edbd2ff6858db714971fb73711e6" gracePeriod=30 Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.645268 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="849de0d3-3456-44c2-bef4-3a435e4a432a" containerName="glance-httpd" containerID="cri-o://5e2f27254ecaeae6872715e18449eaa22b877597c8124da7a49920ec97100c5d" gracePeriod=30 Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.651086 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-config" (OuterVolumeSpecName: "config") pod "80d171a6-11ab-4cdf-b469-acb56ff11735" (UID: "80d171a6-11ab-4cdf-b469-acb56ff11735"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.669110 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.669141 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da0e4cf4-a01f-48df-b61b-796c8bc9f60a-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.669151 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.669160 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80d171a6-11ab-4cdf-b469-acb56ff11735-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.683870 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7c65449fdf-42rxg"] Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.696458 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vflxs"] Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.710281 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-64bb5bfdfc-h2mgd"] Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.944707 4886 scope.go:117] "RemoveContainer" containerID="69b0f3248bd2be75d1851a0e7878c496c05c0ca2dacd1bbce93fad67d36c48ff" Jan 29 17:08:17 crc kubenswrapper[4886]: I0129 17:08:17.994133 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-557f889856-kwzsw"] Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.026318 4886 scope.go:117] "RemoveContainer" containerID="705da8d91cb45e05b6aa5ab5b116ce8252bf3f498078113a7eee5edc1d206bca" Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.027713 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-557f889856-kwzsw"] Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.045985 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-96hn8"] Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.089686 4886 scope.go:117] "RemoveContainer" containerID="26aa10c89bd28f4d17b03fabdd3c3dd7d4b1ab633d533650ee03163b7c656cd5" Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.089840 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-96hn8"] Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.089871 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6f6c4bddd6-xqtdm"] Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.108335 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6f6c4bddd6-xqtdm"] Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.579506 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-64bb5bfdfc-h2mgd" event={"ID":"a004f05d-8133-4d8e-9e3c-d5c9411351ad","Type":"ContainerStarted","Data":"9ba203f1577fa4a2278281eb05f99b6b37f54638178327c02a931842f3130f2d"} Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.579788 4886 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/heat-api-64bb5bfdfc-h2mgd" event={"ID":"a004f05d-8133-4d8e-9e3c-d5c9411351ad","Type":"ContainerStarted","Data":"18d920fdb752d4bed66e2d78d64074a05d8a6665fcb8abfb885f6d42e0a27fe6"} Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.581176 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.591816 4886 generic.go:334] "Generic (PLEG): container finished" podID="04a4a757-71c6-46ec-9019-8d2f64be8285" containerID="d090a953dc19f1ee4b0424500aecfa717e2c4abdf9af4db4264c3428dc2d84f8" exitCode=1 Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.591872 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54985c87ff-g5725" event={"ID":"04a4a757-71c6-46ec-9019-8d2f64be8285","Type":"ContainerDied","Data":"d090a953dc19f1ee4b0424500aecfa717e2c4abdf9af4db4264c3428dc2d84f8"} Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.592533 4886 scope.go:117] "RemoveContainer" containerID="d090a953dc19f1ee4b0424500aecfa717e2c4abdf9af4db4264c3428dc2d84f8" Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.597645 4886 generic.go:334] "Generic (PLEG): container finished" podID="18c5f721-30d1-48de-97e4-52399587c9d1" containerID="be55140e95fb2c7fd3a46b1ece79fa3d9132da294caa5ac8edf498151a8ce0b2" exitCode=0 Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.597710 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vflxs" event={"ID":"18c5f721-30d1-48de-97e4-52399587c9d1","Type":"ContainerDied","Data":"be55140e95fb2c7fd3a46b1ece79fa3d9132da294caa5ac8edf498151a8ce0b2"} Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.597739 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vflxs" event={"ID":"18c5f721-30d1-48de-97e4-52399587c9d1","Type":"ContainerStarted","Data":"fe354152829de757ca5537dde1fd3cfc8eb62b13a98c62b74ae6e9f6ed2f435c"} Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.607681 4886 generic.go:334] "Generic (PLEG): container finished" podID="849de0d3-3456-44c2-bef4-3a435e4a432a" containerID="685691dd71892e3462a49d43e961e4398610edbd2ff6858db714971fb73711e6" exitCode=143 Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.607764 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"849de0d3-3456-44c2-bef4-3a435e4a432a","Type":"ContainerDied","Data":"685691dd71892e3462a49d43e961e4398610edbd2ff6858db714971fb73711e6"} Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.608950 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-64bb5bfdfc-h2mgd" podStartSLOduration=9.608929532 podStartE2EDuration="9.608929532s" podCreationTimestamp="2026-01-29 17:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:18.600614637 +0000 UTC m=+2781.509333919" watchObservedRunningTime="2026-01-29 17:08:18.608929532 +0000 UTC m=+2781.517648804" Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.612371 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0ea79fe-a2e5-4861-be91-aba220b1b221","Type":"ContainerStarted","Data":"d07a1d9b916e4f3e7a8a1402794315d10d0fa212b37288654a33188aff743885"} Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.637912 4886 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="3fa8d357-cef3-43d1-8338-386d9880bb82" path="/var/lib/kubelet/pods/3fa8d357-cef3-43d1-8338-386d9880bb82/volumes" Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.638499 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80d171a6-11ab-4cdf-b469-acb56ff11735" path="/var/lib/kubelet/pods/80d171a6-11ab-4cdf-b469-acb56ff11735/volumes" Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.639081 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da0e4cf4-a01f-48df-b61b-796c8bc9f60a" path="/var/lib/kubelet/pods/da0e4cf4-a01f-48df-b61b-796c8bc9f60a/volumes" Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.639988 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7c65449fdf-42rxg" event={"ID":"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2","Type":"ContainerStarted","Data":"6958576a6365fc34d774dc5015cbac18d99aa6811ed0a85bec28185deabe80bb"} Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.640034 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.640047 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7c65449fdf-42rxg" event={"ID":"c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2","Type":"ContainerStarted","Data":"5eed4ad641d5dcf9de58cae60e50f69e712d0406c6aff33afb9c67bd75e5be40"} Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.651739 4886 generic.go:334] "Generic (PLEG): container finished" podID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" containerID="961b09e7b27b7da7b2c511e013f3ab233e3894f45363e6e86d452b156483c7e5" exitCode=1 Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.652432 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c7bddd46c-bnlxj" event={"ID":"7b6ce536-47ec-45b9-b926-28f1fa7eb80a","Type":"ContainerDied","Data":"961b09e7b27b7da7b2c511e013f3ab233e3894f45363e6e86d452b156483c7e5"} Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.652731 4886 scope.go:117] "RemoveContainer" containerID="961b09e7b27b7da7b2c511e013f3ab233e3894f45363e6e86d452b156483c7e5" Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.687088 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7c65449fdf-42rxg" podStartSLOduration=9.687073813 podStartE2EDuration="9.687073813s" podCreationTimestamp="2026-01-29 17:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:18.657535361 +0000 UTC m=+2781.566254633" watchObservedRunningTime="2026-01-29 17:08:18.687073813 +0000 UTC m=+2781.595793085" Jan 29 17:08:18 crc kubenswrapper[4886]: I0129 17:08:18.875689 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.523005 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-n9fr6"] Jan 29 17:08:19 crc kubenswrapper[4886]: E0129 17:08:19.523845 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d171a6-11ab-4cdf-b469-acb56ff11735" containerName="init" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.523865 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d171a6-11ab-4cdf-b469-acb56ff11735" containerName="init" Jan 29 17:08:19 crc kubenswrapper[4886]: E0129 17:08:19.523888 
4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d171a6-11ab-4cdf-b469-acb56ff11735" containerName="dnsmasq-dns" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.523895 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d171a6-11ab-4cdf-b469-acb56ff11735" containerName="dnsmasq-dns" Jan 29 17:08:19 crc kubenswrapper[4886]: E0129 17:08:19.523919 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0e4cf4-a01f-48df-b61b-796c8bc9f60a" containerName="heat-cfnapi" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.523925 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0e4cf4-a01f-48df-b61b-796c8bc9f60a" containerName="heat-cfnapi" Jan 29 17:08:19 crc kubenswrapper[4886]: E0129 17:08:19.523935 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa8d357-cef3-43d1-8338-386d9880bb82" containerName="heat-api" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.523944 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa8d357-cef3-43d1-8338-386d9880bb82" containerName="heat-api" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.524188 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="da0e4cf4-a01f-48df-b61b-796c8bc9f60a" containerName="heat-cfnapi" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.524223 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="80d171a6-11ab-4cdf-b469-acb56ff11735" containerName="dnsmasq-dns" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.524234 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa8d357-cef3-43d1-8338-386d9880bb82" containerName="heat-api" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.525405 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n9fr6" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.544792 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-n9fr6"] Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.622041 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-6jmdx"] Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.623685 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6jmdx" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.640892 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6jmdx"] Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.653387 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4e9f-account-create-update-sdhth"] Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.655564 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4e9f-account-create-update-sdhth" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.657771 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.667144 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4e9f-account-create-update-sdhth"] Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.671155 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea6c4698-f001-402f-91e3-1e80bc7bf443-operator-scripts\") pod \"nova-api-db-create-n9fr6\" (UID: \"ea6c4698-f001-402f-91e3-1e80bc7bf443\") " pod="openstack/nova-api-db-create-n9fr6" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.671212 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxnqt\" (UniqueName: \"kubernetes.io/projected/ea6c4698-f001-402f-91e3-1e80bc7bf443-kube-api-access-gxnqt\") pod \"nova-api-db-create-n9fr6\" (UID: \"ea6c4698-f001-402f-91e3-1e80bc7bf443\") " pod="openstack/nova-api-db-create-n9fr6" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.673159 4886 generic.go:334] "Generic (PLEG): container finished" podID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" containerID="2eb9aac70b8d95e0c6e925aa406b960e03929e9d6915153ce56a560a835d977d" exitCode=1 Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.673237 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c7bddd46c-bnlxj" event={"ID":"7b6ce536-47ec-45b9-b926-28f1fa7eb80a","Type":"ContainerDied","Data":"2eb9aac70b8d95e0c6e925aa406b960e03929e9d6915153ce56a560a835d977d"} Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.673269 4886 scope.go:117] "RemoveContainer" containerID="961b09e7b27b7da7b2c511e013f3ab233e3894f45363e6e86d452b156483c7e5" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.674030 4886 scope.go:117] "RemoveContainer" containerID="2eb9aac70b8d95e0c6e925aa406b960e03929e9d6915153ce56a560a835d977d" Jan 29 17:08:19 crc kubenswrapper[4886]: E0129 17:08:19.674405 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6c7bddd46c-bnlxj_openstack(7b6ce536-47ec-45b9-b926-28f1fa7eb80a)\"" pod="openstack/heat-api-6c7bddd46c-bnlxj" podUID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.704571 4886 generic.go:334] "Generic (PLEG): container finished" podID="04a4a757-71c6-46ec-9019-8d2f64be8285" containerID="269b4adc6e6be10392170084dc412e856cfe62aa07302ce9122a8ed94105dabe" exitCode=1 Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.706468 4886 scope.go:117] "RemoveContainer" containerID="269b4adc6e6be10392170084dc412e856cfe62aa07302ce9122a8ed94105dabe" Jan 29 17:08:19 crc kubenswrapper[4886]: E0129 17:08:19.706685 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-54985c87ff-g5725_openstack(04a4a757-71c6-46ec-9019-8d2f64be8285)\"" pod="openstack/heat-cfnapi-54985c87ff-g5725" podUID="04a4a757-71c6-46ec-9019-8d2f64be8285" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.706714 4886 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/heat-cfnapi-54985c87ff-g5725" event={"ID":"04a4a757-71c6-46ec-9019-8d2f64be8285","Type":"ContainerDied","Data":"269b4adc6e6be10392170084dc412e856cfe62aa07302ce9122a8ed94105dabe"} Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.774811 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-operator-scripts\") pod \"nova-cell0-db-create-6jmdx\" (UID: \"0abefc39-4eb0-4600-8e11-b5d4af3c11b4\") " pod="openstack/nova-cell0-db-create-6jmdx" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.776104 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-operator-scripts\") pod \"nova-api-4e9f-account-create-update-sdhth\" (UID: \"d13e59b2-0b15-4b7f-b158-ea16ec2b5416\") " pod="openstack/nova-api-4e9f-account-create-update-sdhth" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.776219 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea6c4698-f001-402f-91e3-1e80bc7bf443-operator-scripts\") pod \"nova-api-db-create-n9fr6\" (UID: \"ea6c4698-f001-402f-91e3-1e80bc7bf443\") " pod="openstack/nova-api-db-create-n9fr6" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.776265 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxnqt\" (UniqueName: \"kubernetes.io/projected/ea6c4698-f001-402f-91e3-1e80bc7bf443-kube-api-access-gxnqt\") pod \"nova-api-db-create-n9fr6\" (UID: \"ea6c4698-f001-402f-91e3-1e80bc7bf443\") " pod="openstack/nova-api-db-create-n9fr6" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.776381 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkt66\" (UniqueName: \"kubernetes.io/projected/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-kube-api-access-pkt66\") pod \"nova-cell0-db-create-6jmdx\" (UID: \"0abefc39-4eb0-4600-8e11-b5d4af3c11b4\") " pod="openstack/nova-cell0-db-create-6jmdx" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.776549 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9n6n\" (UniqueName: \"kubernetes.io/projected/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-kube-api-access-r9n6n\") pod \"nova-api-4e9f-account-create-update-sdhth\" (UID: \"d13e59b2-0b15-4b7f-b158-ea16ec2b5416\") " pod="openstack/nova-api-4e9f-account-create-update-sdhth" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.778729 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea6c4698-f001-402f-91e3-1e80bc7bf443-operator-scripts\") pod \"nova-api-db-create-n9fr6\" (UID: \"ea6c4698-f001-402f-91e3-1e80bc7bf443\") " pod="openstack/nova-api-db-create-n9fr6" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.782771 4886 scope.go:117] "RemoveContainer" containerID="d090a953dc19f1ee4b0424500aecfa717e2c4abdf9af4db4264c3428dc2d84f8" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.828145 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-vqrmb"] Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.829876 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-vqrmb" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.854424 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxnqt\" (UniqueName: \"kubernetes.io/projected/ea6c4698-f001-402f-91e3-1e80bc7bf443-kube-api-access-gxnqt\") pod \"nova-api-db-create-n9fr6\" (UID: \"ea6c4698-f001-402f-91e3-1e80bc7bf443\") " pod="openstack/nova-api-db-create-n9fr6" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.884414 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vqrmb"] Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.884460 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9n6n\" (UniqueName: \"kubernetes.io/projected/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-kube-api-access-r9n6n\") pod \"nova-api-4e9f-account-create-update-sdhth\" (UID: \"d13e59b2-0b15-4b7f-b158-ea16ec2b5416\") " pod="openstack/nova-api-4e9f-account-create-update-sdhth" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.884881 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-operator-scripts\") pod \"nova-cell0-db-create-6jmdx\" (UID: \"0abefc39-4eb0-4600-8e11-b5d4af3c11b4\") " pod="openstack/nova-cell0-db-create-6jmdx" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.884990 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-operator-scripts\") pod \"nova-api-4e9f-account-create-update-sdhth\" (UID: \"d13e59b2-0b15-4b7f-b158-ea16ec2b5416\") " pod="openstack/nova-api-4e9f-account-create-update-sdhth" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.885121 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkt66\" (UniqueName: \"kubernetes.io/projected/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-kube-api-access-pkt66\") pod \"nova-cell0-db-create-6jmdx\" (UID: \"0abefc39-4eb0-4600-8e11-b5d4af3c11b4\") " pod="openstack/nova-cell0-db-create-6jmdx" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.886779 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-operator-scripts\") pod \"nova-cell0-db-create-6jmdx\" (UID: \"0abefc39-4eb0-4600-8e11-b5d4af3c11b4\") " pod="openstack/nova-cell0-db-create-6jmdx" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.887759 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-operator-scripts\") pod \"nova-api-4e9f-account-create-update-sdhth\" (UID: \"d13e59b2-0b15-4b7f-b158-ea16ec2b5416\") " pod="openstack/nova-api-4e9f-account-create-update-sdhth" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.901309 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cc0e-account-create-update-nxk7k"] Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.903133 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.904766 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.907026 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9n6n\" (UniqueName: \"kubernetes.io/projected/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-kube-api-access-r9n6n\") pod \"nova-api-4e9f-account-create-update-sdhth\" (UID: \"d13e59b2-0b15-4b7f-b158-ea16ec2b5416\") " pod="openstack/nova-api-4e9f-account-create-update-sdhth" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.907961 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkt66\" (UniqueName: \"kubernetes.io/projected/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-kube-api-access-pkt66\") pod \"nova-cell0-db-create-6jmdx\" (UID: \"0abefc39-4eb0-4600-8e11-b5d4af3c11b4\") " pod="openstack/nova-cell0-db-create-6jmdx" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.937939 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cc0e-account-create-update-nxk7k"] Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.946102 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6jmdx" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.978299 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4e9f-account-create-update-sdhth" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.987035 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0772ac7-3374-4607-a644-f4ac2e1c078a-operator-scripts\") pod \"nova-cell1-db-create-vqrmb\" (UID: \"d0772ac7-3374-4607-a644-f4ac2e1c078a\") " pod="openstack/nova-cell1-db-create-vqrmb" Jan 29 17:08:19 crc kubenswrapper[4886]: I0129 17:08:19.987259 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtmbn\" (UniqueName: \"kubernetes.io/projected/d0772ac7-3374-4607-a644-f4ac2e1c078a-kube-api-access-jtmbn\") pod \"nova-cell1-db-create-vqrmb\" (UID: \"d0772ac7-3374-4607-a644-f4ac2e1c078a\") " pod="openstack/nova-cell1-db-create-vqrmb" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.045927 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f9c8-account-create-update-hcc42"] Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.047563 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.056894 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.061548 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f9c8-account-create-update-hcc42"] Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.098678 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtmbn\" (UniqueName: \"kubernetes.io/projected/d0772ac7-3374-4607-a644-f4ac2e1c078a-kube-api-access-jtmbn\") pod \"nova-cell1-db-create-vqrmb\" (UID: \"d0772ac7-3374-4607-a644-f4ac2e1c078a\") " pod="openstack/nova-cell1-db-create-vqrmb" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.103825 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6af00928-6484-4071-b739-bc211ac220ef-operator-scripts\") pod \"nova-cell0-cc0e-account-create-update-nxk7k\" (UID: \"6af00928-6484-4071-b739-bc211ac220ef\") " pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.104099 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0772ac7-3374-4607-a644-f4ac2e1c078a-operator-scripts\") pod \"nova-cell1-db-create-vqrmb\" (UID: \"d0772ac7-3374-4607-a644-f4ac2e1c078a\") " pod="openstack/nova-cell1-db-create-vqrmb" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.113710 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk62r\" (UniqueName: \"kubernetes.io/projected/6af00928-6484-4071-b739-bc211ac220ef-kube-api-access-pk62r\") pod \"nova-cell0-cc0e-account-create-update-nxk7k\" (UID: \"6af00928-6484-4071-b739-bc211ac220ef\") " pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.114894 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0772ac7-3374-4607-a644-f4ac2e1c078a-operator-scripts\") pod \"nova-cell1-db-create-vqrmb\" (UID: \"d0772ac7-3374-4607-a644-f4ac2e1c078a\") " pod="openstack/nova-cell1-db-create-vqrmb" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.125710 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtmbn\" (UniqueName: \"kubernetes.io/projected/d0772ac7-3374-4607-a644-f4ac2e1c078a-kube-api-access-jtmbn\") pod \"nova-cell1-db-create-vqrmb\" (UID: \"d0772ac7-3374-4607-a644-f4ac2e1c078a\") " pod="openstack/nova-cell1-db-create-vqrmb" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.141539 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-n9fr6" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.219784 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk62r\" (UniqueName: \"kubernetes.io/projected/6af00928-6484-4071-b739-bc211ac220ef-kube-api-access-pk62r\") pod \"nova-cell0-cc0e-account-create-update-nxk7k\" (UID: \"6af00928-6484-4071-b739-bc211ac220ef\") " pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.219948 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6af00928-6484-4071-b739-bc211ac220ef-operator-scripts\") pod \"nova-cell0-cc0e-account-create-update-nxk7k\" (UID: \"6af00928-6484-4071-b739-bc211ac220ef\") " pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.219985 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-operator-scripts\") pod \"nova-cell1-f9c8-account-create-update-hcc42\" (UID: \"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb\") " pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.220147 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlwld\" (UniqueName: \"kubernetes.io/projected/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-kube-api-access-tlwld\") pod \"nova-cell1-f9c8-account-create-update-hcc42\" (UID: \"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb\") " pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.221183 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6af00928-6484-4071-b739-bc211ac220ef-operator-scripts\") pod \"nova-cell0-cc0e-account-create-update-nxk7k\" (UID: \"6af00928-6484-4071-b739-bc211ac220ef\") " pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.241032 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk62r\" (UniqueName: \"kubernetes.io/projected/6af00928-6484-4071-b739-bc211ac220ef-kube-api-access-pk62r\") pod \"nova-cell0-cc0e-account-create-update-nxk7k\" (UID: \"6af00928-6484-4071-b739-bc211ac220ef\") " pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.325734 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-operator-scripts\") pod \"nova-cell1-f9c8-account-create-update-hcc42\" (UID: \"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb\") " pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.325874 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlwld\" (UniqueName: \"kubernetes.io/projected/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-kube-api-access-tlwld\") pod \"nova-cell1-f9c8-account-create-update-hcc42\" (UID: \"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb\") " pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" Jan 29 17:08:20 crc kubenswrapper[4886]: 
I0129 17:08:20.326953 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-operator-scripts\") pod \"nova-cell1-f9c8-account-create-update-hcc42\" (UID: \"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb\") " pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.349083 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlwld\" (UniqueName: \"kubernetes.io/projected/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-kube-api-access-tlwld\") pod \"nova-cell1-f9c8-account-create-update-hcc42\" (UID: \"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb\") " pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.351299 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vqrmb" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.381785 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.405462 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.427406 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.427622 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" containerName="glance-log" containerID="cri-o://d46a9e5456f252ab3dd8ef0ca224f83e7f91449851fd433a23e9070eb20e028e" gracePeriod=30 Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.427756 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" containerName="glance-httpd" containerID="cri-o://819d3c493df902007da456da0899d275e457a2f0ed2e48aedaf84f652820cb61" gracePeriod=30 Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.782374 4886 scope.go:117] "RemoveContainer" containerID="2eb9aac70b8d95e0c6e925aa406b960e03929e9d6915153ce56a560a835d977d" Jan 29 17:08:20 crc kubenswrapper[4886]: E0129 17:08:20.783290 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6c7bddd46c-bnlxj_openstack(7b6ce536-47ec-45b9-b926-28f1fa7eb80a)\"" pod="openstack/heat-api-6c7bddd46c-bnlxj" podUID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.791351 4886 scope.go:117] "RemoveContainer" containerID="269b4adc6e6be10392170084dc412e856cfe62aa07302ce9122a8ed94105dabe" Jan 29 17:08:20 crc kubenswrapper[4886]: E0129 17:08:20.791612 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-54985c87ff-g5725_openstack(04a4a757-71c6-46ec-9019-8d2f64be8285)\"" pod="openstack/heat-cfnapi-54985c87ff-g5725" podUID="04a4a757-71c6-46ec-9019-8d2f64be8285" Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.802542 4886 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vflxs" event={"ID":"18c5f721-30d1-48de-97e4-52399587c9d1","Type":"ContainerStarted","Data":"afb5da406ee3b16e59af7913d87b7d9742dbcfd595f22b00884d57064f6bdef1"} Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.822591 4886 generic.go:334] "Generic (PLEG): container finished" podID="16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" containerID="d46a9e5456f252ab3dd8ef0ca224f83e7f91449851fd433a23e9070eb20e028e" exitCode=143 Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.823510 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf","Type":"ContainerDied","Data":"d46a9e5456f252ab3dd8ef0ca224f83e7f91449851fd433a23e9070eb20e028e"} Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.896694 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6jmdx"] Jan 29 17:08:20 crc kubenswrapper[4886]: I0129 17:08:20.939730 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4e9f-account-create-update-sdhth"] Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.014900 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-n9fr6"] Jan 29 17:08:21 crc kubenswrapper[4886]: W0129 17:08:21.091410 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea6c4698_f001_402f_91e3_1e80bc7bf443.slice/crio-97aa039de70a06170f71988b76c9396909f3b7178da4b75eb9a0fd7d820bb21d WatchSource:0}: Error finding container 97aa039de70a06170f71988b76c9396909f3b7178da4b75eb9a0fd7d820bb21d: Status 404 returned error can't find the container with id 97aa039de70a06170f71988b76c9396909f3b7178da4b75eb9a0fd7d820bb21d Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.569690 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vqrmb"] Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.593912 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f9c8-account-create-update-hcc42"] Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.620104 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cc0e-account-create-update-nxk7k"] Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.855144 4886 generic.go:334] "Generic (PLEG): container finished" podID="0abefc39-4eb0-4600-8e11-b5d4af3c11b4" containerID="8cff761f0cac80358e499809ffa647d36a191c7af1a493dc00f71f33ae4223f1" exitCode=0 Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.855856 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6jmdx" event={"ID":"0abefc39-4eb0-4600-8e11-b5d4af3c11b4","Type":"ContainerDied","Data":"8cff761f0cac80358e499809ffa647d36a191c7af1a493dc00f71f33ae4223f1"} Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.855973 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6jmdx" event={"ID":"0abefc39-4eb0-4600-8e11-b5d4af3c11b4","Type":"ContainerStarted","Data":"1d6d2eb795c39ee31f6bd0a881882b56df9889d142ea82ed82c62281b1f67996"} Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.861647 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" 
event={"ID":"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb","Type":"ContainerStarted","Data":"e1eabc32a80d150906ee8042c9b91dd9d3a691eb3e8f2321170f2610258d0695"} Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.868024 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n9fr6" event={"ID":"ea6c4698-f001-402f-91e3-1e80bc7bf443","Type":"ContainerStarted","Data":"92b4d1b2f475024d893ea29a83366ecc7f80ef2e9282821adbce174622472058"} Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.868068 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n9fr6" event={"ID":"ea6c4698-f001-402f-91e3-1e80bc7bf443","Type":"ContainerStarted","Data":"97aa039de70a06170f71988b76c9396909f3b7178da4b75eb9a0fd7d820bb21d"} Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.876576 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" event={"ID":"6af00928-6484-4071-b739-bc211ac220ef","Type":"ContainerStarted","Data":"91c7222c3b9f7d5be92754c25f343aeff5c1732b0217924a2ad1edc9eaf57e78"} Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.891865 4886 generic.go:334] "Generic (PLEG): container finished" podID="849de0d3-3456-44c2-bef4-3a435e4a432a" containerID="5e2f27254ecaeae6872715e18449eaa22b877597c8124da7a49920ec97100c5d" exitCode=0 Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.891986 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"849de0d3-3456-44c2-bef4-3a435e4a432a","Type":"ContainerDied","Data":"5e2f27254ecaeae6872715e18449eaa22b877597c8124da7a49920ec97100c5d"} Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.917524 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0ea79fe-a2e5-4861-be91-aba220b1b221","Type":"ContainerStarted","Data":"97f8f5e0387fde773bf154bf18b428f934c3b6dd32a6b73bb76a513b5a291c63"} Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.918156 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="ceilometer-central-agent" containerID="cri-o://5d0ddc2798e73cd33929ee945c72ef848dc6759a75fd9fcc95c2f939f265b877" gracePeriod=30 Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.918547 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.918592 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="proxy-httpd" containerID="cri-o://97f8f5e0387fde773bf154bf18b428f934c3b6dd32a6b73bb76a513b5a291c63" gracePeriod=30 Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.918672 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="sg-core" containerID="cri-o://d07a1d9b916e4f3e7a8a1402794315d10d0fa212b37288654a33188aff743885" gracePeriod=30 Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.918721 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="ceilometer-notification-agent" containerID="cri-o://463c890cb672987e4db62f57b14305282dced80284ec2842a2e3a25befe23bf9" gracePeriod=30 Jan 29 17:08:21 crc 
kubenswrapper[4886]: I0129 17:08:21.920549 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vqrmb" event={"ID":"d0772ac7-3374-4607-a644-f4ac2e1c078a","Type":"ContainerStarted","Data":"56926e28702f7f49449b25045bd4430aca71c4abfb7465c1932db4f3abec35bc"} Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.929379 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4e9f-account-create-update-sdhth" event={"ID":"d13e59b2-0b15-4b7f-b158-ea16ec2b5416","Type":"ContainerStarted","Data":"b398660f408eb077ec37e46aac34f95a01068c141577a940f5d64dfc4dc0b027"} Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.929424 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4e9f-account-create-update-sdhth" event={"ID":"d13e59b2-0b15-4b7f-b158-ea16ec2b5416","Type":"ContainerStarted","Data":"e4805d6955b6d3e0ebc12d0484bdd410741675cd4a31046222f6b6bd45082c68"} Jan 29 17:08:21 crc kubenswrapper[4886]: I0129 17:08:21.960667 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.028104266 podStartE2EDuration="18.960649395s" podCreationTimestamp="2026-01-29 17:08:03 +0000 UTC" firstStartedPulling="2026-01-29 17:08:06.648673064 +0000 UTC m=+2769.557392336" lastFinishedPulling="2026-01-29 17:08:20.581218193 +0000 UTC m=+2783.489937465" observedRunningTime="2026-01-29 17:08:21.948494192 +0000 UTC m=+2784.857213464" watchObservedRunningTime="2026-01-29 17:08:21.960649395 +0000 UTC m=+2784.869368667" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.224854 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.225147 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.225644 4886 scope.go:117] "RemoveContainer" containerID="269b4adc6e6be10392170084dc412e856cfe62aa07302ce9122a8ed94105dabe" Jan 29 17:08:22 crc kubenswrapper[4886]: E0129 17:08:22.225972 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-54985c87ff-g5725_openstack(04a4a757-71c6-46ec-9019-8d2f64be8285)\"" pod="openstack/heat-cfnapi-54985c87ff-g5725" podUID="04a4a757-71c6-46ec-9019-8d2f64be8285" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.237826 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.237874 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.238787 4886 scope.go:117] "RemoveContainer" containerID="2eb9aac70b8d95e0c6e925aa406b960e03929e9d6915153ce56a560a835d977d" Jan 29 17:08:22 crc kubenswrapper[4886]: E0129 17:08:22.239023 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6c7bddd46c-bnlxj_openstack(7b6ce536-47ec-45b9-b926-28f1fa7eb80a)\"" pod="openstack/heat-api-6c7bddd46c-bnlxj" podUID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.312445 4886 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.431882 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-scripts\") pod \"849de0d3-3456-44c2-bef4-3a435e4a432a\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.432078 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-httpd-run\") pod \"849de0d3-3456-44c2-bef4-3a435e4a432a\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.432588 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "849de0d3-3456-44c2-bef4-3a435e4a432a" (UID: "849de0d3-3456-44c2-bef4-3a435e4a432a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.432899 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"849de0d3-3456-44c2-bef4-3a435e4a432a\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.432986 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fglvx\" (UniqueName: \"kubernetes.io/projected/849de0d3-3456-44c2-bef4-3a435e4a432a-kube-api-access-fglvx\") pod \"849de0d3-3456-44c2-bef4-3a435e4a432a\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.433019 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-public-tls-certs\") pod \"849de0d3-3456-44c2-bef4-3a435e4a432a\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.433035 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-combined-ca-bundle\") pod \"849de0d3-3456-44c2-bef4-3a435e4a432a\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.433062 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-logs\") pod \"849de0d3-3456-44c2-bef4-3a435e4a432a\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.433110 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-config-data\") pod \"849de0d3-3456-44c2-bef4-3a435e4a432a\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") " Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.433710 4886 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.436546 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-logs" (OuterVolumeSpecName: "logs") pod "849de0d3-3456-44c2-bef4-3a435e4a432a" (UID: "849de0d3-3456-44c2-bef4-3a435e4a432a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.442988 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849de0d3-3456-44c2-bef4-3a435e4a432a-kube-api-access-fglvx" (OuterVolumeSpecName: "kube-api-access-fglvx") pod "849de0d3-3456-44c2-bef4-3a435e4a432a" (UID: "849de0d3-3456-44c2-bef4-3a435e4a432a"). InnerVolumeSpecName "kube-api-access-fglvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.473350 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-scripts" (OuterVolumeSpecName: "scripts") pod "849de0d3-3456-44c2-bef4-3a435e4a432a" (UID: "849de0d3-3456-44c2-bef4-3a435e4a432a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.537000 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.537030 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fglvx\" (UniqueName: \"kubernetes.io/projected/849de0d3-3456-44c2-bef4-3a435e4a432a-kube-api-access-fglvx\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.537041 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849de0d3-3456-44c2-bef4-3a435e4a432a-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.641434 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1" (OuterVolumeSpecName: "glance") pod "849de0d3-3456-44c2-bef4-3a435e4a432a" (UID: "849de0d3-3456-44c2-bef4-3a435e4a432a"). InnerVolumeSpecName "pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 29 17:08:22 crc kubenswrapper[4886]: E0129 17:08:22.642147 4886 reconciler_common.go:156] "operationExecutor.UnmountVolume failed (controllerAttachDetachEnabled true) for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"849de0d3-3456-44c2-bef4-3a435e4a432a\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") : UnmountVolume.NewUnmounter failed for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"849de0d3-3456-44c2-bef4-3a435e4a432a\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/849de0d3-3456-44c2-bef4-3a435e4a432a/volumes/kubernetes.io~csi/pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/849de0d3-3456-44c2-bef4-3a435e4a432a/volumes/kubernetes.io~csi/pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1/vol_data.json]: open /var/lib/kubelet/pods/849de0d3-3456-44c2-bef4-3a435e4a432a/volumes/kubernetes.io~csi/pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1/vol_data.json: no such file or directory" err="UnmountVolume.NewUnmounter failed for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"849de0d3-3456-44c2-bef4-3a435e4a432a\" (UID: \"849de0d3-3456-44c2-bef4-3a435e4a432a\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/849de0d3-3456-44c2-bef4-3a435e4a432a/volumes/kubernetes.io~csi/pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/849de0d3-3456-44c2-bef4-3a435e4a432a/volumes/kubernetes.io~csi/pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1/vol_data.json]: open /var/lib/kubelet/pods/849de0d3-3456-44c2-bef4-3a435e4a432a/volumes/kubernetes.io~csi/pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1/vol_data.json: no such file or directory" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.643163 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") on node \"crc\" " Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.704180 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "849de0d3-3456-44c2-bef4-3a435e4a432a" (UID: "849de0d3-3456-44c2-bef4-3a435e4a432a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.748233 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.760972 4886 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.762422 4886 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1") on node "crc" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.801988 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-config-data" (OuterVolumeSpecName: "config-data") pod "849de0d3-3456-44c2-bef4-3a435e4a432a" (UID: "849de0d3-3456-44c2-bef4-3a435e4a432a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.852658 4886 reconciler_common.go:293] "Volume detached for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.852700 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.877644 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "849de0d3-3456-44c2-bef4-3a435e4a432a" (UID: "849de0d3-3456-44c2-bef4-3a435e4a432a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.954605 4886 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/849de0d3-3456-44c2-bef4-3a435e4a432a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.972994 4886 generic.go:334] "Generic (PLEG): container finished" podID="ea6c4698-f001-402f-91e3-1e80bc7bf443" containerID="92b4d1b2f475024d893ea29a83366ecc7f80ef2e9282821adbce174622472058" exitCode=0 Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.973070 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n9fr6" event={"ID":"ea6c4698-f001-402f-91e3-1e80bc7bf443","Type":"ContainerDied","Data":"92b4d1b2f475024d893ea29a83366ecc7f80ef2e9282821adbce174622472058"} Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.988109 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" event={"ID":"6af00928-6484-4071-b739-bc211ac220ef","Type":"ContainerStarted","Data":"e03fdcc391c686ad6f7c447bf2012b345cc1a12adaddfc3b0b7fbabe7adbed61"} Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.999001 4886 generic.go:334] "Generic (PLEG): container finished" podID="18c5f721-30d1-48de-97e4-52399587c9d1" containerID="afb5da406ee3b16e59af7913d87b7d9742dbcfd595f22b00884d57064f6bdef1" exitCode=0 Jan 29 17:08:22 crc kubenswrapper[4886]: I0129 17:08:22.999066 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vflxs" event={"ID":"18c5f721-30d1-48de-97e4-52399587c9d1","Type":"ContainerDied","Data":"afb5da406ee3b16e59af7913d87b7d9742dbcfd595f22b00884d57064f6bdef1"} Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.011256 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"849de0d3-3456-44c2-bef4-3a435e4a432a","Type":"ContainerDied","Data":"6c945ea15f303c81064b58dfa01521088d6d511849d81e35019f4fd66c782c28"} Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.011302 4886 scope.go:117] "RemoveContainer" containerID="5e2f27254ecaeae6872715e18449eaa22b877597c8124da7a49920ec97100c5d" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.011451 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.015034 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" podStartSLOduration=4.015020022 podStartE2EDuration="4.015020022s" podCreationTimestamp="2026-01-29 17:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:23.005218226 +0000 UTC m=+2785.913937508" watchObservedRunningTime="2026-01-29 17:08:23.015020022 +0000 UTC m=+2785.923739294" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.025025 4886 generic.go:334] "Generic (PLEG): container finished" podID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerID="97f8f5e0387fde773bf154bf18b428f934c3b6dd32a6b73bb76a513b5a291c63" exitCode=0 Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.025057 4886 generic.go:334] "Generic (PLEG): container finished" podID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerID="d07a1d9b916e4f3e7a8a1402794315d10d0fa212b37288654a33188aff743885" exitCode=2 Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.025065 4886 generic.go:334] "Generic (PLEG): container finished" podID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerID="463c890cb672987e4db62f57b14305282dced80284ec2842a2e3a25befe23bf9" exitCode=0 Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.025071 4886 generic.go:334] "Generic (PLEG): container finished" podID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerID="5d0ddc2798e73cd33929ee945c72ef848dc6759a75fd9fcc95c2f939f265b877" exitCode=0 Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.025118 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0ea79fe-a2e5-4861-be91-aba220b1b221","Type":"ContainerDied","Data":"97f8f5e0387fde773bf154bf18b428f934c3b6dd32a6b73bb76a513b5a291c63"} Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.025705 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0ea79fe-a2e5-4861-be91-aba220b1b221","Type":"ContainerDied","Data":"d07a1d9b916e4f3e7a8a1402794315d10d0fa212b37288654a33188aff743885"} Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.025719 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0ea79fe-a2e5-4861-be91-aba220b1b221","Type":"ContainerDied","Data":"463c890cb672987e4db62f57b14305282dced80284ec2842a2e3a25befe23bf9"} Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.025727 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0ea79fe-a2e5-4861-be91-aba220b1b221","Type":"ContainerDied","Data":"5d0ddc2798e73cd33929ee945c72ef848dc6759a75fd9fcc95c2f939f265b877"} Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.035905 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="d0772ac7-3374-4607-a644-f4ac2e1c078a" containerID="e75acdd55522e91761ce2d771dbc17900e4f53d297811cf9623f07bc70ba7052" exitCode=0 Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.035985 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vqrmb" event={"ID":"d0772ac7-3374-4607-a644-f4ac2e1c078a","Type":"ContainerDied","Data":"e75acdd55522e91761ce2d771dbc17900e4f53d297811cf9623f07bc70ba7052"} Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.041704 4886 generic.go:334] "Generic (PLEG): container finished" podID="d13e59b2-0b15-4b7f-b158-ea16ec2b5416" containerID="b398660f408eb077ec37e46aac34f95a01068c141577a940f5d64dfc4dc0b027" exitCode=0 Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.041768 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4e9f-account-create-update-sdhth" event={"ID":"d13e59b2-0b15-4b7f-b158-ea16ec2b5416","Type":"ContainerDied","Data":"b398660f408eb077ec37e46aac34f95a01068c141577a940f5d64dfc4dc0b027"} Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.054487 4886 generic.go:334] "Generic (PLEG): container finished" podID="8258df8a-fd9a-4546-8ea7-ce4b7f7180bb" containerID="55979afc492dd3730aa23e20e090c57835e6091af47e18bbcd87fee5afa8dde9" exitCode=0 Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.054936 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" event={"ID":"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb","Type":"ContainerDied","Data":"55979afc492dd3730aa23e20e090c57835e6091af47e18bbcd87fee5afa8dde9"} Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.137544 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.163194 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-sg-core-conf-yaml\") pod \"e0ea79fe-a2e5-4861-be91-aba220b1b221\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.163346 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-log-httpd\") pod \"e0ea79fe-a2e5-4861-be91-aba220b1b221\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.163411 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-scripts\") pod \"e0ea79fe-a2e5-4861-be91-aba220b1b221\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.163478 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-run-httpd\") pod \"e0ea79fe-a2e5-4861-be91-aba220b1b221\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.163576 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt8hq\" (UniqueName: \"kubernetes.io/projected/e0ea79fe-a2e5-4861-be91-aba220b1b221-kube-api-access-rt8hq\") pod \"e0ea79fe-a2e5-4861-be91-aba220b1b221\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " Jan 29 17:08:23 crc 
kubenswrapper[4886]: I0129 17:08:23.163612 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-combined-ca-bundle\") pod \"e0ea79fe-a2e5-4861-be91-aba220b1b221\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.163755 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-config-data\") pod \"e0ea79fe-a2e5-4861-be91-aba220b1b221\" (UID: \"e0ea79fe-a2e5-4861-be91-aba220b1b221\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.165352 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e0ea79fe-a2e5-4861-be91-aba220b1b221" (UID: "e0ea79fe-a2e5-4861-be91-aba220b1b221"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.167042 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.175843 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e0ea79fe-a2e5-4861-be91-aba220b1b221" (UID: "e0ea79fe-a2e5-4861-be91-aba220b1b221"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.179631 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ea79fe-a2e5-4861-be91-aba220b1b221-kube-api-access-rt8hq" (OuterVolumeSpecName: "kube-api-access-rt8hq") pod "e0ea79fe-a2e5-4861-be91-aba220b1b221" (UID: "e0ea79fe-a2e5-4861-be91-aba220b1b221"). InnerVolumeSpecName "kube-api-access-rt8hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.199143 4886 scope.go:117] "RemoveContainer" containerID="685691dd71892e3462a49d43e961e4398610edbd2ff6858db714971fb73711e6" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.199258 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.204253 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-scripts" (OuterVolumeSpecName: "scripts") pod "e0ea79fe-a2e5-4861-be91-aba220b1b221" (UID: "e0ea79fe-a2e5-4861-be91-aba220b1b221"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.265461 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:08:23 crc kubenswrapper[4886]: E0129 17:08:23.267277 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="proxy-httpd" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267297 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="proxy-httpd" Jan 29 17:08:23 crc kubenswrapper[4886]: E0129 17:08:23.267355 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="ceilometer-notification-agent" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267364 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="ceilometer-notification-agent" Jan 29 17:08:23 crc kubenswrapper[4886]: E0129 17:08:23.267382 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="sg-core" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267389 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="sg-core" Jan 29 17:08:23 crc kubenswrapper[4886]: E0129 17:08:23.267406 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849de0d3-3456-44c2-bef4-3a435e4a432a" containerName="glance-httpd" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267414 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="849de0d3-3456-44c2-bef4-3a435e4a432a" containerName="glance-httpd" Jan 29 17:08:23 crc kubenswrapper[4886]: E0129 17:08:23.267434 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849de0d3-3456-44c2-bef4-3a435e4a432a" containerName="glance-log" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267442 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="849de0d3-3456-44c2-bef4-3a435e4a432a" containerName="glance-log" Jan 29 17:08:23 crc kubenswrapper[4886]: E0129 17:08:23.267460 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="ceilometer-central-agent" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267468 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="ceilometer-central-agent" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267745 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="ceilometer-central-agent" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267762 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="sg-core" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267779 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="ceilometer-notification-agent" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267794 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="849de0d3-3456-44c2-bef4-3a435e4a432a" containerName="glance-log" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267802 4886 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="849de0d3-3456-44c2-bef4-3a435e4a432a" containerName="glance-httpd" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.267822 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" containerName="proxy-httpd" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.268410 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.268439 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.268471 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0ea79fe-a2e5-4861-be91-aba220b1b221-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.268537 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt8hq\" (UniqueName: \"kubernetes.io/projected/e0ea79fe-a2e5-4861-be91-aba220b1b221-kube-api-access-rt8hq\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.269797 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.271881 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.272088 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.291947 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e0ea79fe-a2e5-4861-be91-aba220b1b221" (UID: "e0ea79fe-a2e5-4861-be91-aba220b1b221"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.295684 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.347200 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0ea79fe-a2e5-4861-be91-aba220b1b221" (UID: "e0ea79fe-a2e5-4861-be91-aba220b1b221"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.364813 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-config-data" (OuterVolumeSpecName: "config-data") pod "e0ea79fe-a2e5-4861-be91-aba220b1b221" (UID: "e0ea79fe-a2e5-4861-be91-aba220b1b221"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.375084 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-logs\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.375408 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.375532 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nmlh\" (UniqueName: \"kubernetes.io/projected/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-kube-api-access-7nmlh\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.375627 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.375781 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-config-data\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.375798 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.375848 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.375912 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-scripts\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.376053 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.376065 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.376074 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ea79fe-a2e5-4861-be91-aba220b1b221-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.478444 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.478530 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nmlh\" (UniqueName: \"kubernetes.io/projected/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-kube-api-access-7nmlh\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.478607 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.478692 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-config-data\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.478714 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.478748 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.478796 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-scripts\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.478853 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-logs\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.481128 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.482258 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.482289 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9fc1bf04f61733e1543e4c6d32069c38c610c3d0fa9a349fa6a409f3542d3c50/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.485483 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-scripts\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.485819 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-logs\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.488669 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.489555 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-config-data\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.490247 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.518392 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nmlh\" (UniqueName: \"kubernetes.io/projected/2dbf03ea-9df9-4f03-aee9-113dabed1c7a-kube-api-access-7nmlh\") pod 
\"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.549255 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n9fr6" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.576563 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a580962-e55c-4bdc-ba31-c39bc4f20fb1\") pod \"glance-default-external-api-0\" (UID: \"2dbf03ea-9df9-4f03-aee9-113dabed1c7a\") " pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.580123 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea6c4698-f001-402f-91e3-1e80bc7bf443-operator-scripts\") pod \"ea6c4698-f001-402f-91e3-1e80bc7bf443\" (UID: \"ea6c4698-f001-402f-91e3-1e80bc7bf443\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.580291 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxnqt\" (UniqueName: \"kubernetes.io/projected/ea6c4698-f001-402f-91e3-1e80bc7bf443-kube-api-access-gxnqt\") pod \"ea6c4698-f001-402f-91e3-1e80bc7bf443\" (UID: \"ea6c4698-f001-402f-91e3-1e80bc7bf443\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.581477 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6c4698-f001-402f-91e3-1e80bc7bf443-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea6c4698-f001-402f-91e3-1e80bc7bf443" (UID: "ea6c4698-f001-402f-91e3-1e80bc7bf443"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.590225 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6c4698-f001-402f-91e3-1e80bc7bf443-kube-api-access-gxnqt" (OuterVolumeSpecName: "kube-api-access-gxnqt") pod "ea6c4698-f001-402f-91e3-1e80bc7bf443" (UID: "ea6c4698-f001-402f-91e3-1e80bc7bf443"). InnerVolumeSpecName "kube-api-access-gxnqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.591699 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.684094 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea6c4698-f001-402f-91e3-1e80bc7bf443-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.684859 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxnqt\" (UniqueName: \"kubernetes.io/projected/ea6c4698-f001-402f-91e3-1e80bc7bf443-kube-api-access-gxnqt\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.967978 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4e9f-account-create-update-sdhth" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.979642 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6jmdx" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.995292 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-operator-scripts\") pod \"0abefc39-4eb0-4600-8e11-b5d4af3c11b4\" (UID: \"0abefc39-4eb0-4600-8e11-b5d4af3c11b4\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.995448 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9n6n\" (UniqueName: \"kubernetes.io/projected/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-kube-api-access-r9n6n\") pod \"d13e59b2-0b15-4b7f-b158-ea16ec2b5416\" (UID: \"d13e59b2-0b15-4b7f-b158-ea16ec2b5416\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.995486 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-operator-scripts\") pod \"d13e59b2-0b15-4b7f-b158-ea16ec2b5416\" (UID: \"d13e59b2-0b15-4b7f-b158-ea16ec2b5416\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.995590 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkt66\" (UniqueName: \"kubernetes.io/projected/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-kube-api-access-pkt66\") pod \"0abefc39-4eb0-4600-8e11-b5d4af3c11b4\" (UID: \"0abefc39-4eb0-4600-8e11-b5d4af3c11b4\") " Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.995957 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0abefc39-4eb0-4600-8e11-b5d4af3c11b4" (UID: "0abefc39-4eb0-4600-8e11-b5d4af3c11b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.996644 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:23 crc kubenswrapper[4886]: I0129 17:08:23.996893 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d13e59b2-0b15-4b7f-b158-ea16ec2b5416" (UID: "d13e59b2-0b15-4b7f-b158-ea16ec2b5416"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.003856 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-kube-api-access-pkt66" (OuterVolumeSpecName: "kube-api-access-pkt66") pod "0abefc39-4eb0-4600-8e11-b5d4af3c11b4" (UID: "0abefc39-4eb0-4600-8e11-b5d4af3c11b4"). InnerVolumeSpecName "kube-api-access-pkt66". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.005266 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-kube-api-access-r9n6n" (OuterVolumeSpecName: "kube-api-access-r9n6n") pod "d13e59b2-0b15-4b7f-b158-ea16ec2b5416" (UID: "d13e59b2-0b15-4b7f-b158-ea16ec2b5416"). 
InnerVolumeSpecName "kube-api-access-r9n6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.069216 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n9fr6" event={"ID":"ea6c4698-f001-402f-91e3-1e80bc7bf443","Type":"ContainerDied","Data":"97aa039de70a06170f71988b76c9396909f3b7178da4b75eb9a0fd7d820bb21d"} Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.069259 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97aa039de70a06170f71988b76c9396909f3b7178da4b75eb9a0fd7d820bb21d" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.069342 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n9fr6" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.077335 4886 generic.go:334] "Generic (PLEG): container finished" podID="6af00928-6484-4071-b739-bc211ac220ef" containerID="e03fdcc391c686ad6f7c447bf2012b345cc1a12adaddfc3b0b7fbabe7adbed61" exitCode=0 Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.077397 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" event={"ID":"6af00928-6484-4071-b739-bc211ac220ef","Type":"ContainerDied","Data":"e03fdcc391c686ad6f7c447bf2012b345cc1a12adaddfc3b0b7fbabe7adbed61"} Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.093817 4886 generic.go:334] "Generic (PLEG): container finished" podID="16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" containerID="819d3c493df902007da456da0899d275e457a2f0ed2e48aedaf84f652820cb61" exitCode=0 Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.093885 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf","Type":"ContainerDied","Data":"819d3c493df902007da456da0899d275e457a2f0ed2e48aedaf84f652820cb61"} Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.098805 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9n6n\" (UniqueName: \"kubernetes.io/projected/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-kube-api-access-r9n6n\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.098833 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13e59b2-0b15-4b7f-b158-ea16ec2b5416-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.098843 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkt66\" (UniqueName: \"kubernetes.io/projected/0abefc39-4eb0-4600-8e11-b5d4af3c11b4-kube-api-access-pkt66\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.102962 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.112658 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0ea79fe-a2e5-4861-be91-aba220b1b221","Type":"ContainerDied","Data":"928834e62ea2e840bea0af8f378a7be863b8582e831ecb530090b696cd7380b1"} Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.112726 4886 scope.go:117] "RemoveContainer" containerID="97f8f5e0387fde773bf154bf18b428f934c3b6dd32a6b73bb76a513b5a291c63" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.112891 4886 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.135848 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4e9f-account-create-update-sdhth" event={"ID":"d13e59b2-0b15-4b7f-b158-ea16ec2b5416","Type":"ContainerDied","Data":"e4805d6955b6d3e0ebc12d0484bdd410741675cd4a31046222f6b6bd45082c68"} Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.135876 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4805d6955b6d3e0ebc12d0484bdd410741675cd4a31046222f6b6bd45082c68" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.139678 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4e9f-account-create-update-sdhth" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.151697 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6jmdx" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.152452 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6jmdx" event={"ID":"0abefc39-4eb0-4600-8e11-b5d4af3c11b4","Type":"ContainerDied","Data":"1d6d2eb795c39ee31f6bd0a881882b56df9889d142ea82ed82c62281b1f67996"} Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.152491 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d6d2eb795c39ee31f6bd0a881882b56df9889d142ea82ed82c62281b1f67996" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.216105 4886 scope.go:117] "RemoveContainer" containerID="d07a1d9b916e4f3e7a8a1402794315d10d0fa212b37288654a33188aff743885" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.235402 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.252093 4886 scope.go:117] "RemoveContainer" containerID="463c890cb672987e4db62f57b14305282dced80284ec2842a2e3a25befe23bf9" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.291080 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.296098 4886 scope.go:117] "RemoveContainer" containerID="5d0ddc2798e73cd33929ee945c72ef848dc6759a75fd9fcc95c2f939f265b877" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.318218 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:24 crc kubenswrapper[4886]: E0129 17:08:24.320194 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0abefc39-4eb0-4600-8e11-b5d4af3c11b4" containerName="mariadb-database-create" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.320424 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abefc39-4eb0-4600-8e11-b5d4af3c11b4" containerName="mariadb-database-create" Jan 29 17:08:24 crc kubenswrapper[4886]: E0129 17:08:24.320487 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13e59b2-0b15-4b7f-b158-ea16ec2b5416" containerName="mariadb-account-create-update" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.320504 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13e59b2-0b15-4b7f-b158-ea16ec2b5416" containerName="mariadb-account-create-update" Jan 29 17:08:24 crc kubenswrapper[4886]: E0129 17:08:24.320593 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6c4698-f001-402f-91e3-1e80bc7bf443" 
containerName="mariadb-database-create" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.320606 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6c4698-f001-402f-91e3-1e80bc7bf443" containerName="mariadb-database-create" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.321338 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0abefc39-4eb0-4600-8e11-b5d4af3c11b4" containerName="mariadb-database-create" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.321369 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13e59b2-0b15-4b7f-b158-ea16ec2b5416" containerName="mariadb-account-create-update" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.321384 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6c4698-f001-402f-91e3-1e80bc7bf443" containerName="mariadb-database-create" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.325709 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.331586 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.331780 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.350182 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.516531 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-run-httpd\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.516668 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.516719 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.516869 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-log-httpd\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.516927 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmdsd\" (UniqueName: \"kubernetes.io/projected/291e8ff3-6792-4900-86a1-df3730548041-kube-api-access-hmdsd\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.516997 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-config-data\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.517120 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-scripts\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.624881 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-log-httpd\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.624955 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmdsd\" (UniqueName: \"kubernetes.io/projected/291e8ff3-6792-4900-86a1-df3730548041-kube-api-access-hmdsd\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.625005 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-config-data\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.625133 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-scripts\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.625193 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-run-httpd\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.625269 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.625965 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-log-httpd\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.626941 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.636048 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.636290 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-run-httpd\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.637728 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-config-data\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.646642 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.647219 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-scripts\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.651608 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmdsd\" (UniqueName: \"kubernetes.io/projected/291e8ff3-6792-4900-86a1-df3730548041-kube-api-access-hmdsd\") pod \"ceilometer-0\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.671565 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.672912 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849de0d3-3456-44c2-bef4-3a435e4a432a" path="/var/lib/kubelet/pods/849de0d3-3456-44c2-bef4-3a435e4a432a/volumes" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.676221 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ea79fe-a2e5-4861-be91-aba220b1b221" path="/var/lib/kubelet/pods/e0ea79fe-a2e5-4861-be91-aba220b1b221/volumes" Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.861023 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 17:08:24 crc kubenswrapper[4886]: I0129 17:08:24.865383 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.045276 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-scripts\") pod \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.045556 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-internal-tls-certs\") pod \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.045599 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhpzr\" (UniqueName: \"kubernetes.io/projected/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-kube-api-access-fhpzr\") pod \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.045724 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-httpd-run\") pod \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.045775 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-config-data\") pod \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.045877 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-combined-ca-bundle\") pod \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.045930 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-logs\") pod \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.058699 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-logs" (OuterVolumeSpecName: "logs") pod "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" (UID: "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.058841 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\" (UID: \"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.064807 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.063393 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" (UID: "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.143774 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-scripts" (OuterVolumeSpecName: "scripts") pod "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" (UID: "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.143883 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-kube-api-access-fhpzr" (OuterVolumeSpecName: "kube-api-access-fhpzr") pod "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" (UID: "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf"). InnerVolumeSpecName "kube-api-access-fhpzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.143948 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019" (OuterVolumeSpecName: "glance") pod "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" (UID: "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf"). InnerVolumeSpecName "pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.146934 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" (UID: "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.171997 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") on node \"crc\" " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.172059 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.172086 4886 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.172098 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhpzr\" (UniqueName: \"kubernetes.io/projected/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-kube-api-access-fhpzr\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.172107 4886 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.191067 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vqrmb" event={"ID":"d0772ac7-3374-4607-a644-f4ac2e1c078a","Type":"ContainerDied","Data":"56926e28702f7f49449b25045bd4430aca71c4abfb7465c1932db4f3abec35bc"} Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.191116 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56926e28702f7f49449b25045bd4430aca71c4abfb7465c1932db4f3abec35bc" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.208008 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" (UID: "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.211079 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" event={"ID":"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb","Type":"ContainerDied","Data":"e1eabc32a80d150906ee8042c9b91dd9d3a691eb3e8f2321170f2610258d0695"} Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.211116 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1eabc32a80d150906ee8042c9b91dd9d3a691eb3e8f2321170f2610258d0695" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.219250 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dbf03ea-9df9-4f03-aee9-113dabed1c7a","Type":"ContainerStarted","Data":"7ae008cfe708205b4ec455c74e5866300c590e18ba606d283c32108e0e208c62"} Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.227115 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-config-data" (OuterVolumeSpecName: "config-data") pod "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" (UID: "16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.240768 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vflxs" event={"ID":"18c5f721-30d1-48de-97e4-52399587c9d1","Type":"ContainerStarted","Data":"62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903"} Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.279055 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.279094 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.279234 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.279621 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf","Type":"ContainerDied","Data":"71bc8d6cf1178c38541a40863263406b012b61b297b4f5183d44e11e56405a8a"} Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.279728 4886 scope.go:117] "RemoveContainer" containerID="819d3c493df902007da456da0899d275e457a2f0ed2e48aedaf84f652820cb61" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.279914 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.282260 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vqrmb" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.282555 4886 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.282667 4886 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019") on node "crc" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.314215 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vflxs" podStartSLOduration=8.333269489 podStartE2EDuration="13.31419847s" podCreationTimestamp="2026-01-29 17:08:12 +0000 UTC" firstStartedPulling="2026-01-29 17:08:18.599067094 +0000 UTC m=+2781.507786356" lastFinishedPulling="2026-01-29 17:08:23.579996065 +0000 UTC m=+2786.488715337" observedRunningTime="2026-01-29 17:08:25.275440358 +0000 UTC m=+2788.184159640" watchObservedRunningTime="2026-01-29 17:08:25.31419847 +0000 UTC m=+2788.222917742" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.380061 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-operator-scripts\") pod \"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb\" (UID: \"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.380505 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtmbn\" (UniqueName: \"kubernetes.io/projected/d0772ac7-3374-4607-a644-f4ac2e1c078a-kube-api-access-jtmbn\") pod \"d0772ac7-3374-4607-a644-f4ac2e1c078a\" (UID: \"d0772ac7-3374-4607-a644-f4ac2e1c078a\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.380535 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0772ac7-3374-4607-a644-f4ac2e1c078a-operator-scripts\") pod \"d0772ac7-3374-4607-a644-f4ac2e1c078a\" (UID: \"d0772ac7-3374-4607-a644-f4ac2e1c078a\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.380580 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlwld\" (UniqueName: \"kubernetes.io/projected/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-kube-api-access-tlwld\") pod \"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb\" (UID: \"8258df8a-fd9a-4546-8ea7-ce4b7f7180bb\") " Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.380658 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8258df8a-fd9a-4546-8ea7-ce4b7f7180bb" (UID: "8258df8a-fd9a-4546-8ea7-ce4b7f7180bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.385596 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0772ac7-3374-4607-a644-f4ac2e1c078a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0772ac7-3374-4607-a644-f4ac2e1c078a" (UID: "d0772ac7-3374-4607-a644-f4ac2e1c078a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.391544 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0772ac7-3374-4607-a644-f4ac2e1c078a-kube-api-access-jtmbn" (OuterVolumeSpecName: "kube-api-access-jtmbn") pod "d0772ac7-3374-4607-a644-f4ac2e1c078a" (UID: "d0772ac7-3374-4607-a644-f4ac2e1c078a"). InnerVolumeSpecName "kube-api-access-jtmbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.391686 4886 scope.go:117] "RemoveContainer" containerID="d46a9e5456f252ab3dd8ef0ca224f83e7f91449851fd433a23e9070eb20e028e" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.393320 4886 reconciler_common.go:293] "Volume detached for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.393385 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.393398 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtmbn\" (UniqueName: \"kubernetes.io/projected/d0772ac7-3374-4607-a644-f4ac2e1c078a-kube-api-access-jtmbn\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.393411 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0772ac7-3374-4607-a644-f4ac2e1c078a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.403617 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-kube-api-access-tlwld" (OuterVolumeSpecName: "kube-api-access-tlwld") pod "8258df8a-fd9a-4546-8ea7-ce4b7f7180bb" (UID: "8258df8a-fd9a-4546-8ea7-ce4b7f7180bb"). InnerVolumeSpecName "kube-api-access-tlwld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.441911 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.453825 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.465376 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:08:25 crc kubenswrapper[4886]: E0129 17:08:25.466213 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" containerName="glance-httpd" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.466228 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" containerName="glance-httpd" Jan 29 17:08:25 crc kubenswrapper[4886]: E0129 17:08:25.466239 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0772ac7-3374-4607-a644-f4ac2e1c078a" containerName="mariadb-database-create" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.466246 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0772ac7-3374-4607-a644-f4ac2e1c078a" containerName="mariadb-database-create" Jan 29 17:08:25 crc kubenswrapper[4886]: E0129 17:08:25.466298 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" containerName="glance-log" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.466306 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" containerName="glance-log" Jan 29 17:08:25 crc kubenswrapper[4886]: E0129 17:08:25.466316 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8258df8a-fd9a-4546-8ea7-ce4b7f7180bb" containerName="mariadb-account-create-update" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.466353 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8258df8a-fd9a-4546-8ea7-ce4b7f7180bb" containerName="mariadb-account-create-update" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.466648 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" containerName="glance-log" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.466699 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8258df8a-fd9a-4546-8ea7-ce4b7f7180bb" containerName="mariadb-account-create-update" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.466719 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" containerName="glance-httpd" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.466767 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0772ac7-3374-4607-a644-f4ac2e1c078a" containerName="mariadb-database-create" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.468423 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.473511 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.473745 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.496181 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlwld\" (UniqueName: \"kubernetes.io/projected/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb-kube-api-access-tlwld\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.503732 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.562839 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.608532 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.609153 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.609644 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.610018 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.610069 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.610148 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81437be4-b399-40e9-9c33-e71319326af8-logs\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.610275 
4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m4n7\" (UniqueName: \"kubernetes.io/projected/81437be4-b399-40e9-9c33-e71319326af8-kube-api-access-2m4n7\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.614600 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81437be4-b399-40e9-9c33-e71319326af8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.717622 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81437be4-b399-40e9-9c33-e71319326af8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.717684 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.717902 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.717946 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.717998 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.718023 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.718067 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81437be4-b399-40e9-9c33-e71319326af8-logs\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 
17:08:25.718114 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m4n7\" (UniqueName: \"kubernetes.io/projected/81437be4-b399-40e9-9c33-e71319326af8-kube-api-access-2m4n7\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.721217 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81437be4-b399-40e9-9c33-e71319326af8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.733238 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81437be4-b399-40e9-9c33-e71319326af8-logs\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.736434 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.736462 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a7b71ee9dc20b2cd8e0489051d74fcf4864cc02a892819f8a5785e080087446e/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.747626 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.749028 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.751623 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.756054 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81437be4-b399-40e9-9c33-e71319326af8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:25 crc kubenswrapper[4886]: I0129 17:08:25.789046 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2m4n7\" (UniqueName: \"kubernetes.io/projected/81437be4-b399-40e9-9c33-e71319326af8-kube-api-access-2m4n7\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.033571 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3be0811-27ce-4f01-a1ee-e88ee60ba019\") pod \"glance-default-internal-api-0\" (UID: \"81437be4-b399-40e9-9c33-e71319326af8\") " pod="openstack/glance-default-internal-api-0" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.110113 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.196081 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.338193 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk62r\" (UniqueName: \"kubernetes.io/projected/6af00928-6484-4071-b739-bc211ac220ef-kube-api-access-pk62r\") pod \"6af00928-6484-4071-b739-bc211ac220ef\" (UID: \"6af00928-6484-4071-b739-bc211ac220ef\") " Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.338900 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6af00928-6484-4071-b739-bc211ac220ef-operator-scripts\") pod \"6af00928-6484-4071-b739-bc211ac220ef\" (UID: \"6af00928-6484-4071-b739-bc211ac220ef\") " Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.340058 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6af00928-6484-4071-b739-bc211ac220ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6af00928-6484-4071-b739-bc211ac220ef" (UID: "6af00928-6484-4071-b739-bc211ac220ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.358566 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af00928-6484-4071-b739-bc211ac220ef-kube-api-access-pk62r" (OuterVolumeSpecName: "kube-api-access-pk62r") pod "6af00928-6484-4071-b739-bc211ac220ef" (UID: "6af00928-6484-4071-b739-bc211ac220ef"). InnerVolumeSpecName "kube-api-access-pk62r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.379562 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291e8ff3-6792-4900-86a1-df3730548041","Type":"ContainerStarted","Data":"e9683c7a0a1e9a4a4afcaf55416c4d002525f6149a721a2eb46199347f8c0103"} Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.380515 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-846d49f49c-kc98b" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.396258 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" event={"ID":"6af00928-6484-4071-b739-bc211ac220ef","Type":"ContainerDied","Data":"91c7222c3b9f7d5be92754c25f343aeff5c1732b0217924a2ad1edc9eaf57e78"} Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.396296 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91c7222c3b9f7d5be92754c25f343aeff5c1732b0217924a2ad1edc9eaf57e78" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.396495 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc0e-account-create-update-nxk7k" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.405152 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f9c8-account-create-update-hcc42" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.406090 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vqrmb" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.480730 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6af00928-6484-4071-b739-bc211ac220ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.480762 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk62r\" (UniqueName: \"kubernetes.io/projected/6af00928-6484-4071-b739-bc211ac220ef-kube-api-access-pk62r\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.500754 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7854df7c4b-dn4j7"] Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.501157 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7854df7c4b-dn4j7" podUID="0ff8b641-0d76-41ce-b6ac-7d708effebc0" containerName="neutron-api" containerID="cri-o://75e8cf0cad7d6d59d88f3f3bd6a97cab33d3691af01126d62cdae48b3d82240f" gracePeriod=30 Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.501389 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7854df7c4b-dn4j7" podUID="0ff8b641-0d76-41ce-b6ac-7d708effebc0" containerName="neutron-httpd" containerID="cri-o://f3ee0a56aaca61cef2419de911db690ccd8876c78a545e2b8864e16aa4ff333a" gracePeriod=30 Jan 29 17:08:26 crc kubenswrapper[4886]: I0129 17:08:26.645683 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf" path="/var/lib/kubelet/pods/16c3788a-e2f7-4af4-8c2e-dc5aad6f3dbf/volumes" Jan 29 17:08:27 crc kubenswrapper[4886]: I0129 17:08:27.254696 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5f6fd667fd-4s5hk" Jan 29 17:08:27 crc 
kubenswrapper[4886]: I0129 17:08:27.281375 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 17:08:27 crc kubenswrapper[4886]: I0129 17:08:27.329165 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-54f8bbfbf-9qjxm"] Jan 29 17:08:27 crc kubenswrapper[4886]: I0129 17:08:27.329373 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-54f8bbfbf-9qjxm" podUID="92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" containerName="heat-engine" containerID="cri-o://b974dc7a13dfe4723bbe5629a3fd12f5dbc56e7cab5fd25c13a1d891ca45ce3f" gracePeriod=60 Jan 29 17:08:27 crc kubenswrapper[4886]: I0129 17:08:27.473976 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dbf03ea-9df9-4f03-aee9-113dabed1c7a","Type":"ContainerStarted","Data":"ef497bed49a4a288b6a5bb91a3f5de21fdb4d87b94282ea416c9156beaf4f5d8"} Jan 29 17:08:27 crc kubenswrapper[4886]: I0129 17:08:27.492230 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81437be4-b399-40e9-9c33-e71319326af8","Type":"ContainerStarted","Data":"28b98de75177e4384713e1d50e58b8c51918e7f32830394947da1871c49de6bb"} Jan 29 17:08:27 crc kubenswrapper[4886]: I0129 17:08:27.509676 4886 generic.go:334] "Generic (PLEG): container finished" podID="0ff8b641-0d76-41ce-b6ac-7d708effebc0" containerID="f3ee0a56aaca61cef2419de911db690ccd8876c78a545e2b8864e16aa4ff333a" exitCode=0 Jan 29 17:08:27 crc kubenswrapper[4886]: I0129 17:08:27.509750 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7854df7c4b-dn4j7" event={"ID":"0ff8b641-0d76-41ce-b6ac-7d708effebc0","Type":"ContainerDied","Data":"f3ee0a56aaca61cef2419de911db690ccd8876c78a545e2b8864e16aa4ff333a"} Jan 29 17:08:27 crc kubenswrapper[4886]: I0129 17:08:27.546823 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291e8ff3-6792-4900-86a1-df3730548041","Type":"ContainerStarted","Data":"3ddc8827ee40ed9c34df4f01749ce22387bf3f776bb544ffddfacdc88b3c01b2"} Jan 29 17:08:27 crc kubenswrapper[4886]: I0129 17:08:27.982740 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7c65449fdf-42rxg" Jan 29 17:08:28 crc kubenswrapper[4886]: I0129 17:08:28.078783 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-64bb5bfdfc-h2mgd" Jan 29 17:08:28 crc kubenswrapper[4886]: I0129 17:08:28.086940 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-54985c87ff-g5725"] Jan 29 17:08:28 crc kubenswrapper[4886]: I0129 17:08:28.274918 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6c7bddd46c-bnlxj"] Jan 29 17:08:28 crc kubenswrapper[4886]: I0129 17:08:28.633128 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dbf03ea-9df9-4f03-aee9-113dabed1c7a","Type":"ContainerStarted","Data":"0d92f69f91eee5fa6fac8149c03fa945659bdf95b999773a4673f1504dac0060"} Jan 29 17:08:28 crc kubenswrapper[4886]: I0129 17:08:28.673238 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81437be4-b399-40e9-9c33-e71319326af8","Type":"ContainerStarted","Data":"df0462edbf1213821887b3f3e0e071cded45cf21e034be49f29377b0f167d78e"} Jan 29 17:08:28 crc kubenswrapper[4886]: I0129 17:08:28.673275 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291e8ff3-6792-4900-86a1-df3730548041","Type":"ContainerStarted","Data":"16b1fa849040aab8f0e2883ea043b834d6db5438318a7823960a49828f277bbc"} Jan 29 17:08:28 crc kubenswrapper[4886]: I0129 17:08:28.673286 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291e8ff3-6792-4900-86a1-df3730548041","Type":"ContainerStarted","Data":"c99184ccff1048cbbd7bc7dc522f9a1c02ed8d7c96b828fa7d43e50b4bf7d853"} Jan 29 17:08:28 crc kubenswrapper[4886]: E0129 17:08:28.729973 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b974dc7a13dfe4723bbe5629a3fd12f5dbc56e7cab5fd25c13a1d891ca45ce3f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 29 17:08:28 crc kubenswrapper[4886]: E0129 17:08:28.738428 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b974dc7a13dfe4723bbe5629a3fd12f5dbc56e7cab5fd25c13a1d891ca45ce3f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 29 17:08:28 crc kubenswrapper[4886]: E0129 17:08:28.762454 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b974dc7a13dfe4723bbe5629a3fd12f5dbc56e7cab5fd25c13a1d891ca45ce3f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 29 17:08:28 crc kubenswrapper[4886]: E0129 17:08:28.762539 4886 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-54f8bbfbf-9qjxm" podUID="92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" containerName="heat-engine" Jan 29 17:08:28 crc kubenswrapper[4886]: I0129 17:08:28.823557 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.823533952 podStartE2EDuration="5.823533952s" podCreationTimestamp="2026-01-29 17:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:28.800193895 +0000 UTC m=+2791.708913177" watchObservedRunningTime="2026-01-29 17:08:28.823533952 +0000 UTC m=+2791.732253224" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.014867 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.118784 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data-custom\") pod \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.118831 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-combined-ca-bundle\") pod \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.118944 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data\") pod \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.126543 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6cjb\" (UniqueName: \"kubernetes.io/projected/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-kube-api-access-p6cjb\") pod \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.198966 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.216627 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7b6ce536-47ec-45b9-b926-28f1fa7eb80a" (UID: "7b6ce536-47ec-45b9-b926-28f1fa7eb80a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.216717 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-kube-api-access-p6cjb" (OuterVolumeSpecName: "kube-api-access-p6cjb") pod "7b6ce536-47ec-45b9-b926-28f1fa7eb80a" (UID: "7b6ce536-47ec-45b9-b926-28f1fa7eb80a"). InnerVolumeSpecName "kube-api-access-p6cjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.231605 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b6ce536-47ec-45b9-b926-28f1fa7eb80a" (UID: "7b6ce536-47ec-45b9-b926-28f1fa7eb80a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.243391 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-combined-ca-bundle\") pod \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\" (UID: \"7b6ce536-47ec-45b9-b926-28f1fa7eb80a\") " Jan 29 17:08:29 crc kubenswrapper[4886]: W0129 17:08:29.244023 4886 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/7b6ce536-47ec-45b9-b926-28f1fa7eb80a/volumes/kubernetes.io~secret/combined-ca-bundle Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.244049 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b6ce536-47ec-45b9-b926-28f1fa7eb80a" (UID: "7b6ce536-47ec-45b9-b926-28f1fa7eb80a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.244592 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.244638 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.244674 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6cjb\" (UniqueName: \"kubernetes.io/projected/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-kube-api-access-p6cjb\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.295220 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data" (OuterVolumeSpecName: "config-data") pod "7b6ce536-47ec-45b9-b926-28f1fa7eb80a" (UID: "7b6ce536-47ec-45b9-b926-28f1fa7eb80a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.350273 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb8rb\" (UniqueName: \"kubernetes.io/projected/04a4a757-71c6-46ec-9019-8d2f64be8285-kube-api-access-bb8rb\") pod \"04a4a757-71c6-46ec-9019-8d2f64be8285\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.350343 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data\") pod \"04a4a757-71c6-46ec-9019-8d2f64be8285\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.350435 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data-custom\") pod \"04a4a757-71c6-46ec-9019-8d2f64be8285\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.350470 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-combined-ca-bundle\") pod \"04a4a757-71c6-46ec-9019-8d2f64be8285\" (UID: \"04a4a757-71c6-46ec-9019-8d2f64be8285\") " Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.350853 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6ce536-47ec-45b9-b926-28f1fa7eb80a-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.354161 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "04a4a757-71c6-46ec-9019-8d2f64be8285" (UID: "04a4a757-71c6-46ec-9019-8d2f64be8285"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.357429 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a4a757-71c6-46ec-9019-8d2f64be8285-kube-api-access-bb8rb" (OuterVolumeSpecName: "kube-api-access-bb8rb") pod "04a4a757-71c6-46ec-9019-8d2f64be8285" (UID: "04a4a757-71c6-46ec-9019-8d2f64be8285"). InnerVolumeSpecName "kube-api-access-bb8rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.392165 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04a4a757-71c6-46ec-9019-8d2f64be8285" (UID: "04a4a757-71c6-46ec-9019-8d2f64be8285"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.440401 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data" (OuterVolumeSpecName: "config-data") pod "04a4a757-71c6-46ec-9019-8d2f64be8285" (UID: "04a4a757-71c6-46ec-9019-8d2f64be8285"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.452562 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.452792 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.452851 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb8rb\" (UniqueName: \"kubernetes.io/projected/04a4a757-71c6-46ec-9019-8d2f64be8285-kube-api-access-bb8rb\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.452907 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04a4a757-71c6-46ec-9019-8d2f64be8285-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.660442 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.660733 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.686665 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81437be4-b399-40e9-9c33-e71319326af8","Type":"ContainerStarted","Data":"6449b6c9d1c44f3a9f4fafbfeac03bbccd9b2e03eed7084df5eed46099830409"} Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.688994 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c7bddd46c-bnlxj" event={"ID":"7b6ce536-47ec-45b9-b926-28f1fa7eb80a","Type":"ContainerDied","Data":"28c29d3f5a45d8f6e82cfdb663ace90ab610bc4d1d57239fe93c946573d05d45"} Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.689032 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6c7bddd46c-bnlxj" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.689049 4886 scope.go:117] "RemoveContainer" containerID="2eb9aac70b8d95e0c6e925aa406b960e03929e9d6915153ce56a560a835d977d" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.706371 4886 generic.go:334] "Generic (PLEG): container finished" podID="0ff8b641-0d76-41ce-b6ac-7d708effebc0" containerID="75e8cf0cad7d6d59d88f3f3bd6a97cab33d3691af01126d62cdae48b3d82240f" exitCode=0 Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.706444 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7854df7c4b-dn4j7" event={"ID":"0ff8b641-0d76-41ce-b6ac-7d708effebc0","Type":"ContainerDied","Data":"75e8cf0cad7d6d59d88f3f3bd6a97cab33d3691af01126d62cdae48b3d82240f"} Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.716750 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-54985c87ff-g5725" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.716800 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54985c87ff-g5725" event={"ID":"04a4a757-71c6-46ec-9019-8d2f64be8285","Type":"ContainerDied","Data":"7f461b34367fc19b6002113f40bc4d964e2fb98d4e2fb8a58fd1680309b095e9"} Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.721438 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.721397462 podStartE2EDuration="4.721397462s" podCreationTimestamp="2026-01-29 17:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:08:29.70891259 +0000 UTC m=+2792.617631862" watchObservedRunningTime="2026-01-29 17:08:29.721397462 +0000 UTC m=+2792.630116724" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.768268 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6c7bddd46c-bnlxj"] Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.780300 4886 scope.go:117] "RemoveContainer" containerID="269b4adc6e6be10392170084dc412e856cfe62aa07302ce9122a8ed94105dabe" Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.786386 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6c7bddd46c-bnlxj"] Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.937643 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-54985c87ff-g5725"] Jan 29 17:08:29 crc kubenswrapper[4886]: I0129 17:08:29.947170 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-54985c87ff-g5725"] Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.310247 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.357465 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c4q4z"] Jan 29 17:08:30 crc kubenswrapper[4886]: E0129 17:08:30.360446 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff8b641-0d76-41ce-b6ac-7d708effebc0" containerName="neutron-httpd" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360476 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff8b641-0d76-41ce-b6ac-7d708effebc0" containerName="neutron-httpd" Jan 29 17:08:30 crc kubenswrapper[4886]: E0129 17:08:30.360502 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" containerName="heat-api" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360511 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" containerName="heat-api" Jan 29 17:08:30 crc kubenswrapper[4886]: E0129 17:08:30.360540 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a4a757-71c6-46ec-9019-8d2f64be8285" containerName="heat-cfnapi" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360547 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a4a757-71c6-46ec-9019-8d2f64be8285" containerName="heat-cfnapi" Jan 29 17:08:30 crc kubenswrapper[4886]: E0129 17:08:30.360560 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a4a757-71c6-46ec-9019-8d2f64be8285" containerName="heat-cfnapi" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360565 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a4a757-71c6-46ec-9019-8d2f64be8285" containerName="heat-cfnapi" Jan 29 17:08:30 crc kubenswrapper[4886]: E0129 17:08:30.360576 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" containerName="heat-api" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360581 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" containerName="heat-api" Jan 29 17:08:30 crc kubenswrapper[4886]: E0129 17:08:30.360594 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff8b641-0d76-41ce-b6ac-7d708effebc0" containerName="neutron-api" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360599 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff8b641-0d76-41ce-b6ac-7d708effebc0" containerName="neutron-api" Jan 29 17:08:30 crc kubenswrapper[4886]: E0129 17:08:30.360609 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af00928-6484-4071-b739-bc211ac220ef" containerName="mariadb-account-create-update" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360615 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af00928-6484-4071-b739-bc211ac220ef" containerName="mariadb-account-create-update" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360930 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a4a757-71c6-46ec-9019-8d2f64be8285" containerName="heat-cfnapi" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360948 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" containerName="heat-api" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360964 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff8b641-0d76-41ce-b6ac-7d708effebc0" 
containerName="neutron-httpd" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360975 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff8b641-0d76-41ce-b6ac-7d708effebc0" containerName="neutron-api" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.360988 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a4a757-71c6-46ec-9019-8d2f64be8285" containerName="heat-cfnapi" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.361002 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af00928-6484-4071-b739-bc211ac220ef" containerName="mariadb-account-create-update" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.361821 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.373374 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.373682 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.373865 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wcdz5" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.403817 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c4q4z"] Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.490591 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-httpd-config\") pod \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.490789 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-combined-ca-bundle\") pod \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.491564 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-ovndb-tls-certs\") pod \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.491735 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhjq8\" (UniqueName: \"kubernetes.io/projected/0ff8b641-0d76-41ce-b6ac-7d708effebc0-kube-api-access-nhjq8\") pod \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.491828 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-config\") pod \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\" (UID: \"0ff8b641-0d76-41ce-b6ac-7d708effebc0\") " Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.492539 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97hdc\" (UniqueName: 
\"kubernetes.io/projected/c467eb7e-a553-4fc5-b366-607a30fe18dd-kube-api-access-97hdc\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.492615 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-scripts\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.492670 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.493009 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-config-data\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.515876 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0ff8b641-0d76-41ce-b6ac-7d708effebc0" (UID: "0ff8b641-0d76-41ce-b6ac-7d708effebc0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.516070 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ff8b641-0d76-41ce-b6ac-7d708effebc0-kube-api-access-nhjq8" (OuterVolumeSpecName: "kube-api-access-nhjq8") pod "0ff8b641-0d76-41ce-b6ac-7d708effebc0" (UID: "0ff8b641-0d76-41ce-b6ac-7d708effebc0"). InnerVolumeSpecName "kube-api-access-nhjq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.595059 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-config-data\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.595174 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97hdc\" (UniqueName: \"kubernetes.io/projected/c467eb7e-a553-4fc5-b366-607a30fe18dd-kube-api-access-97hdc\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.595221 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-scripts\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.595249 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.595386 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhjq8\" (UniqueName: \"kubernetes.io/projected/0ff8b641-0d76-41ce-b6ac-7d708effebc0-kube-api-access-nhjq8\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.595399 4886 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.614186 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-config-data\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.624756 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-scripts\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.629930 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97hdc\" (UniqueName: \"kubernetes.io/projected/c467eb7e-a553-4fc5-b366-607a30fe18dd-kube-api-access-97hdc\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.634297 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c4q4z\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.643371 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a4a757-71c6-46ec-9019-8d2f64be8285" path="/var/lib/kubelet/pods/04a4a757-71c6-46ec-9019-8d2f64be8285/volumes" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.644346 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" path="/var/lib/kubelet/pods/7b6ce536-47ec-45b9-b926-28f1fa7eb80a/volumes" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.655789 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-config" (OuterVolumeSpecName: "config") pod "0ff8b641-0d76-41ce-b6ac-7d708effebc0" (UID: "0ff8b641-0d76-41ce-b6ac-7d708effebc0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.662450 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ff8b641-0d76-41ce-b6ac-7d708effebc0" (UID: "0ff8b641-0d76-41ce-b6ac-7d708effebc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.691871 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.697701 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.697729 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.731241 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7854df7c4b-dn4j7" event={"ID":"0ff8b641-0d76-41ce-b6ac-7d708effebc0","Type":"ContainerDied","Data":"e7a3e9e15910d73e70e0b6e954b7743de9f55b25dd0f0bfd34c348eb738633d2"} Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.731538 4886 scope.go:117] "RemoveContainer" containerID="f3ee0a56aaca61cef2419de911db690ccd8876c78a545e2b8864e16aa4ff333a" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.731660 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7854df7c4b-dn4j7" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.756134 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291e8ff3-6792-4900-86a1-df3730548041","Type":"ContainerStarted","Data":"75f2a13548205c6b54e7c335c35141a38cfd5ad2dd6734bb6fdb9670a340bd2a"} Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.757763 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.760442 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0ff8b641-0d76-41ce-b6ac-7d708effebc0" (UID: "0ff8b641-0d76-41ce-b6ac-7d708effebc0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.793304 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.213679715 podStartE2EDuration="6.793286942s" podCreationTimestamp="2026-01-29 17:08:24 +0000 UTC" firstStartedPulling="2026-01-29 17:08:25.589958567 +0000 UTC m=+2788.498677839" lastFinishedPulling="2026-01-29 17:08:30.169565794 +0000 UTC m=+2793.078285066" observedRunningTime="2026-01-29 17:08:30.787705635 +0000 UTC m=+2793.696424907" watchObservedRunningTime="2026-01-29 17:08:30.793286942 +0000 UTC m=+2793.702006214" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.800174 4886 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff8b641-0d76-41ce-b6ac-7d708effebc0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:30 crc kubenswrapper[4886]: I0129 17:08:30.950580 4886 scope.go:117] "RemoveContainer" containerID="75e8cf0cad7d6d59d88f3f3bd6a97cab33d3691af01126d62cdae48b3d82240f" Jan 29 17:08:31 crc kubenswrapper[4886]: I0129 17:08:31.087045 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7854df7c4b-dn4j7"] Jan 29 17:08:31 crc kubenswrapper[4886]: I0129 17:08:31.096548 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7854df7c4b-dn4j7"] Jan 29 17:08:31 crc kubenswrapper[4886]: I0129 17:08:31.357226 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c4q4z"] Jan 29 17:08:31 crc kubenswrapper[4886]: W0129 17:08:31.362012 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc467eb7e_a553_4fc5_b366_607a30fe18dd.slice/crio-e030969deba149d036416125fae7ad0b0c1ce2a5efabff4aeea1c2936fb7a1ec WatchSource:0}: Error finding container e030969deba149d036416125fae7ad0b0c1ce2a5efabff4aeea1c2936fb7a1ec: Status 404 returned error can't find the container with id e030969deba149d036416125fae7ad0b0c1ce2a5efabff4aeea1c2936fb7a1ec Jan 29 17:08:31 crc kubenswrapper[4886]: I0129 17:08:31.806566 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c4q4z" event={"ID":"c467eb7e-a553-4fc5-b366-607a30fe18dd","Type":"ContainerStarted","Data":"e030969deba149d036416125fae7ad0b0c1ce2a5efabff4aeea1c2936fb7a1ec"} Jan 29 17:08:32 crc kubenswrapper[4886]: I0129 17:08:32.540984 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:32 crc kubenswrapper[4886]: I0129 17:08:32.542217 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:32 crc kubenswrapper[4886]: I0129 17:08:32.613961 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:32 crc kubenswrapper[4886]: I0129 17:08:32.629143 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ff8b641-0d76-41ce-b6ac-7d708effebc0" path="/var/lib/kubelet/pods/0ff8b641-0d76-41ce-b6ac-7d708effebc0/volumes" Jan 29 17:08:32 crc kubenswrapper[4886]: I0129 17:08:32.888708 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:32 crc kubenswrapper[4886]: I0129 17:08:32.957531 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vflxs"] Jan 29 17:08:33 crc kubenswrapper[4886]: I0129 17:08:33.592106 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 29 17:08:33 crc kubenswrapper[4886]: I0129 17:08:33.592192 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 29 17:08:33 crc kubenswrapper[4886]: I0129 17:08:33.682112 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 29 17:08:33 crc kubenswrapper[4886]: I0129 17:08:33.787369 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 29 17:08:33 crc kubenswrapper[4886]: I0129 17:08:33.847444 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 17:08:33 crc kubenswrapper[4886]: I0129 17:08:33.847486 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 17:08:34 crc kubenswrapper[4886]: I0129 17:08:34.860731 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vflxs" podUID="18c5f721-30d1-48de-97e4-52399587c9d1" containerName="registry-server" containerID="cri-o://62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903" gracePeriod=2 Jan 29 17:08:35 crc kubenswrapper[4886]: I0129 17:08:35.877541 4886 generic.go:334] "Generic (PLEG): container finished" podID="18c5f721-30d1-48de-97e4-52399587c9d1" containerID="62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903" exitCode=0 Jan 29 17:08:35 crc kubenswrapper[4886]: I0129 17:08:35.877614 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vflxs" event={"ID":"18c5f721-30d1-48de-97e4-52399587c9d1","Type":"ContainerDied","Data":"62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903"} Jan 29 17:08:36 crc kubenswrapper[4886]: I0129 17:08:36.111339 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:36 crc kubenswrapper[4886]: I0129 17:08:36.111389 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:36 crc kubenswrapper[4886]: I0129 17:08:36.189604 4886 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:36 crc kubenswrapper[4886]: I0129 17:08:36.189692 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:36 crc kubenswrapper[4886]: I0129 17:08:36.906960 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:36 crc kubenswrapper[4886]: I0129 17:08:36.907446 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:38 crc kubenswrapper[4886]: E0129 17:08:38.711315 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b974dc7a13dfe4723bbe5629a3fd12f5dbc56e7cab5fd25c13a1d891ca45ce3f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 29 17:08:38 crc kubenswrapper[4886]: E0129 17:08:38.717111 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b974dc7a13dfe4723bbe5629a3fd12f5dbc56e7cab5fd25c13a1d891ca45ce3f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 29 17:08:38 crc kubenswrapper[4886]: E0129 17:08:38.720037 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b974dc7a13dfe4723bbe5629a3fd12f5dbc56e7cab5fd25c13a1d891ca45ce3f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 29 17:08:38 crc kubenswrapper[4886]: E0129 17:08:38.720099 4886 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-54f8bbfbf-9qjxm" podUID="92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" containerName="heat-engine" Jan 29 17:08:38 crc kubenswrapper[4886]: I0129 17:08:38.924787 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 17:08:38 crc kubenswrapper[4886]: I0129 17:08:38.924829 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 17:08:38 crc kubenswrapper[4886]: I0129 17:08:38.987149 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:38 crc kubenswrapper[4886]: I0129 17:08:38.987453 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="ceilometer-central-agent" containerID="cri-o://3ddc8827ee40ed9c34df4f01749ce22387bf3f776bb544ffddfacdc88b3c01b2" gracePeriod=30 Jan 29 17:08:38 crc kubenswrapper[4886]: I0129 17:08:38.987568 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="proxy-httpd" containerID="cri-o://75f2a13548205c6b54e7c335c35141a38cfd5ad2dd6734bb6fdb9670a340bd2a" gracePeriod=30 Jan 29 17:08:38 crc kubenswrapper[4886]: I0129 17:08:38.987692 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="291e8ff3-6792-4900-86a1-df3730548041" 
containerName="ceilometer-notification-agent" containerID="cri-o://c99184ccff1048cbbd7bc7dc522f9a1c02ed8d7c96b828fa7d43e50b4bf7d853" gracePeriod=30 Jan 29 17:08:38 crc kubenswrapper[4886]: I0129 17:08:38.987687 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="sg-core" containerID="cri-o://16b1fa849040aab8f0e2883ea043b834d6db5438318a7823960a49828f277bbc" gracePeriod=30 Jan 29 17:08:39 crc kubenswrapper[4886]: I0129 17:08:39.945195 4886 generic.go:334] "Generic (PLEG): container finished" podID="291e8ff3-6792-4900-86a1-df3730548041" containerID="75f2a13548205c6b54e7c335c35141a38cfd5ad2dd6734bb6fdb9670a340bd2a" exitCode=0 Jan 29 17:08:39 crc kubenswrapper[4886]: I0129 17:08:39.945238 4886 generic.go:334] "Generic (PLEG): container finished" podID="291e8ff3-6792-4900-86a1-df3730548041" containerID="16b1fa849040aab8f0e2883ea043b834d6db5438318a7823960a49828f277bbc" exitCode=2 Jan 29 17:08:39 crc kubenswrapper[4886]: I0129 17:08:39.945251 4886 generic.go:334] "Generic (PLEG): container finished" podID="291e8ff3-6792-4900-86a1-df3730548041" containerID="3ddc8827ee40ed9c34df4f01749ce22387bf3f776bb544ffddfacdc88b3c01b2" exitCode=0 Jan 29 17:08:39 crc kubenswrapper[4886]: I0129 17:08:39.945276 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291e8ff3-6792-4900-86a1-df3730548041","Type":"ContainerDied","Data":"75f2a13548205c6b54e7c335c35141a38cfd5ad2dd6734bb6fdb9670a340bd2a"} Jan 29 17:08:39 crc kubenswrapper[4886]: I0129 17:08:39.945307 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291e8ff3-6792-4900-86a1-df3730548041","Type":"ContainerDied","Data":"16b1fa849040aab8f0e2883ea043b834d6db5438318a7823960a49828f277bbc"} Jan 29 17:08:39 crc kubenswrapper[4886]: I0129 17:08:39.945317 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291e8ff3-6792-4900-86a1-df3730548041","Type":"ContainerDied","Data":"3ddc8827ee40ed9c34df4f01749ce22387bf3f776bb544ffddfacdc88b3c01b2"} Jan 29 17:08:39 crc kubenswrapper[4886]: I0129 17:08:39.962504 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 17:08:39 crc kubenswrapper[4886]: I0129 17:08:39.962596 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 17:08:39 crc kubenswrapper[4886]: I0129 17:08:39.963668 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 17:08:39 crc kubenswrapper[4886]: I0129 17:08:39.991012 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:39 crc kubenswrapper[4886]: I0129 17:08:39.991089 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 17:08:40 crc kubenswrapper[4886]: I0129 17:08:40.531451 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 17:08:40 crc kubenswrapper[4886]: I0129 17:08:40.961291 4886 generic.go:334] "Generic (PLEG): container finished" podID="291e8ff3-6792-4900-86a1-df3730548041" containerID="c99184ccff1048cbbd7bc7dc522f9a1c02ed8d7c96b828fa7d43e50b4bf7d853" exitCode=0 Jan 29 17:08:40 crc kubenswrapper[4886]: I0129 17:08:40.961533 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"291e8ff3-6792-4900-86a1-df3730548041","Type":"ContainerDied","Data":"c99184ccff1048cbbd7bc7dc522f9a1c02ed8d7c96b828fa7d43e50b4bf7d853"} Jan 29 17:08:42 crc kubenswrapper[4886]: E0129 17:08:42.541609 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903 is running failed: container process not found" containerID="62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903" cmd=["grpc_health_probe","-addr=:50051"] Jan 29 17:08:42 crc kubenswrapper[4886]: E0129 17:08:42.542368 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903 is running failed: container process not found" containerID="62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903" cmd=["grpc_health_probe","-addr=:50051"] Jan 29 17:08:42 crc kubenswrapper[4886]: E0129 17:08:42.542676 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903 is running failed: container process not found" containerID="62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903" cmd=["grpc_health_probe","-addr=:50051"] Jan 29 17:08:42 crc kubenswrapper[4886]: E0129 17:08:42.542712 4886 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-vflxs" podUID="18c5f721-30d1-48de-97e4-52399587c9d1" containerName="registry-server" Jan 29 17:08:42 crc kubenswrapper[4886]: I0129 17:08:42.982828 4886 generic.go:334] "Generic (PLEG): container finished" podID="92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" containerID="b974dc7a13dfe4723bbe5629a3fd12f5dbc56e7cab5fd25c13a1d891ca45ce3f" exitCode=0 Jan 29 17:08:42 crc kubenswrapper[4886]: I0129 17:08:42.982972 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54f8bbfbf-9qjxm" event={"ID":"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f","Type":"ContainerDied","Data":"b974dc7a13dfe4723bbe5629a3fd12f5dbc56e7cab5fd25c13a1d891ca45ce3f"} Jan 29 17:08:43 crc kubenswrapper[4886]: I0129 17:08:43.998970 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vflxs" event={"ID":"18c5f721-30d1-48de-97e4-52399587c9d1","Type":"ContainerDied","Data":"fe354152829de757ca5537dde1fd3cfc8eb62b13a98c62b74ae6e9f6ed2f435c"} Jan 29 17:08:43 crc kubenswrapper[4886]: I0129 17:08:43.999245 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe354152829de757ca5537dde1fd3cfc8eb62b13a98c62b74ae6e9f6ed2f435c" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.158963 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.239168 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-catalog-content\") pod \"18c5f721-30d1-48de-97e4-52399587c9d1\" (UID: \"18c5f721-30d1-48de-97e4-52399587c9d1\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.239264 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-utilities\") pod \"18c5f721-30d1-48de-97e4-52399587c9d1\" (UID: \"18c5f721-30d1-48de-97e4-52399587c9d1\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.240039 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-utilities" (OuterVolumeSpecName: "utilities") pod "18c5f721-30d1-48de-97e4-52399587c9d1" (UID: "18c5f721-30d1-48de-97e4-52399587c9d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.240240 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tzn7\" (UniqueName: \"kubernetes.io/projected/18c5f721-30d1-48de-97e4-52399587c9d1-kube-api-access-2tzn7\") pod \"18c5f721-30d1-48de-97e4-52399587c9d1\" (UID: \"18c5f721-30d1-48de-97e4-52399587c9d1\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.240959 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.249581 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c5f721-30d1-48de-97e4-52399587c9d1-kube-api-access-2tzn7" (OuterVolumeSpecName: "kube-api-access-2tzn7") pod "18c5f721-30d1-48de-97e4-52399587c9d1" (UID: "18c5f721-30d1-48de-97e4-52399587c9d1"). InnerVolumeSpecName "kube-api-access-2tzn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.289733 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18c5f721-30d1-48de-97e4-52399587c9d1" (UID: "18c5f721-30d1-48de-97e4-52399587c9d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.344468 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c5f721-30d1-48de-97e4-52399587c9d1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.344752 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tzn7\" (UniqueName: \"kubernetes.io/projected/18c5f721-30d1-48de-97e4-52399587c9d1-kube-api-access-2tzn7\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.617645 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.629648 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.754074 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data\") pod \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.754126 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data-custom\") pod \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.754183 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-run-httpd\") pod \"291e8ff3-6792-4900-86a1-df3730548041\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.754223 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-combined-ca-bundle\") pod \"291e8ff3-6792-4900-86a1-df3730548041\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.754290 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-sg-core-conf-yaml\") pod \"291e8ff3-6792-4900-86a1-df3730548041\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.754353 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-log-httpd\") pod \"291e8ff3-6792-4900-86a1-df3730548041\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.754443 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmdsd\" (UniqueName: \"kubernetes.io/projected/291e8ff3-6792-4900-86a1-df3730548041-kube-api-access-hmdsd\") pod \"291e8ff3-6792-4900-86a1-df3730548041\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.754476 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-combined-ca-bundle\") pod \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.754538 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-scripts\") pod \"291e8ff3-6792-4900-86a1-df3730548041\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.754552 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-config-data\") pod \"291e8ff3-6792-4900-86a1-df3730548041\" (UID: \"291e8ff3-6792-4900-86a1-df3730548041\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.754592 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn4rg\" (UniqueName: \"kubernetes.io/projected/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-kube-api-access-bn4rg\") pod \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\" (UID: \"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f\") " Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.756021 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "291e8ff3-6792-4900-86a1-df3730548041" (UID: "291e8ff3-6792-4900-86a1-df3730548041"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.756869 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "291e8ff3-6792-4900-86a1-df3730548041" (UID: "291e8ff3-6792-4900-86a1-df3730548041"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.758880 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-scripts" (OuterVolumeSpecName: "scripts") pod "291e8ff3-6792-4900-86a1-df3730548041" (UID: "291e8ff3-6792-4900-86a1-df3730548041"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.759363 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/291e8ff3-6792-4900-86a1-df3730548041-kube-api-access-hmdsd" (OuterVolumeSpecName: "kube-api-access-hmdsd") pod "291e8ff3-6792-4900-86a1-df3730548041" (UID: "291e8ff3-6792-4900-86a1-df3730548041"). InnerVolumeSpecName "kube-api-access-hmdsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.759547 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-kube-api-access-bn4rg" (OuterVolumeSpecName: "kube-api-access-bn4rg") pod "92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" (UID: "92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f"). InnerVolumeSpecName "kube-api-access-bn4rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.760801 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" (UID: "92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.789808 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" (UID: "92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.794923 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "291e8ff3-6792-4900-86a1-df3730548041" (UID: "291e8ff3-6792-4900-86a1-df3730548041"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.828797 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data" (OuterVolumeSpecName: "config-data") pod "92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" (UID: "92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.857775 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.857811 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.857822 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.857831 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.857839 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/291e8ff3-6792-4900-86a1-df3730548041-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.857848 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmdsd\" (UniqueName: \"kubernetes.io/projected/291e8ff3-6792-4900-86a1-df3730548041-kube-api-access-hmdsd\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.857858 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.857867 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 
17:08:44.857876 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn4rg\" (UniqueName: \"kubernetes.io/projected/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f-kube-api-access-bn4rg\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.867097 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "291e8ff3-6792-4900-86a1-df3730548041" (UID: "291e8ff3-6792-4900-86a1-df3730548041"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.887478 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-config-data" (OuterVolumeSpecName: "config-data") pod "291e8ff3-6792-4900-86a1-df3730548041" (UID: "291e8ff3-6792-4900-86a1-df3730548041"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.960358 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:44 crc kubenswrapper[4886]: I0129 17:08:44.960398 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291e8ff3-6792-4900-86a1-df3730548041-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.015584 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"291e8ff3-6792-4900-86a1-df3730548041","Type":"ContainerDied","Data":"e9683c7a0a1e9a4a4afcaf55416c4d002525f6149a721a2eb46199347f8c0103"} Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.015642 4886 scope.go:117] "RemoveContainer" containerID="75f2a13548205c6b54e7c335c35141a38cfd5ad2dd6734bb6fdb9670a340bd2a" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.015778 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.021256 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54f8bbfbf-9qjxm" event={"ID":"92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f","Type":"ContainerDied","Data":"0f319e6982b89bee08a0388a5eb4c63bb973328dc67504ccea174e9928171156"} Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.021376 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-54f8bbfbf-9qjxm" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.031883 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vflxs" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.032638 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c4q4z" event={"ID":"c467eb7e-a553-4fc5-b366-607a30fe18dd","Type":"ContainerStarted","Data":"b316bbc4bed9ea6d21a1f48ac1daf91a604e958e8664a1c95a0d70b2476abcfa"} Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.068757 4886 scope.go:117] "RemoveContainer" containerID="16b1fa849040aab8f0e2883ea043b834d6db5438318a7823960a49828f277bbc" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.132204 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-c4q4z" podStartSLOduration=2.5024291720000003 podStartE2EDuration="15.132178266s" podCreationTimestamp="2026-01-29 17:08:30 +0000 UTC" firstStartedPulling="2026-01-29 17:08:31.369684627 +0000 UTC m=+2794.278403909" lastFinishedPulling="2026-01-29 17:08:43.999433731 +0000 UTC m=+2806.908153003" observedRunningTime="2026-01-29 17:08:45.066785934 +0000 UTC m=+2807.975505216" watchObservedRunningTime="2026-01-29 17:08:45.132178266 +0000 UTC m=+2808.040897538" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.156382 4886 scope.go:117] "RemoveContainer" containerID="c99184ccff1048cbbd7bc7dc522f9a1c02ed8d7c96b828fa7d43e50b4bf7d853" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.169956 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vflxs"] Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.191932 4886 scope.go:117] "RemoveContainer" containerID="3ddc8827ee40ed9c34df4f01749ce22387bf3f776bb544ffddfacdc88b3c01b2" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.196376 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vflxs"] Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.208019 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-54f8bbfbf-9qjxm"] Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.219286 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-54f8bbfbf-9qjxm"] Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.236792 4886 scope.go:117] "RemoveContainer" containerID="b974dc7a13dfe4723bbe5629a3fd12f5dbc56e7cab5fd25c13a1d891ca45ce3f" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.237441 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.254382 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.270620 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:45 crc kubenswrapper[4886]: E0129 17:08:45.271179 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" containerName="heat-engine" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271200 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" containerName="heat-engine" Jan 29 17:08:45 crc kubenswrapper[4886]: E0129 17:08:45.271306 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="proxy-httpd" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271316 4886 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="proxy-httpd" Jan 29 17:08:45 crc kubenswrapper[4886]: E0129 17:08:45.271347 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="ceilometer-central-agent" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271356 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="ceilometer-central-agent" Jan 29 17:08:45 crc kubenswrapper[4886]: E0129 17:08:45.271369 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c5f721-30d1-48de-97e4-52399587c9d1" containerName="registry-server" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271377 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c5f721-30d1-48de-97e4-52399587c9d1" containerName="registry-server" Jan 29 17:08:45 crc kubenswrapper[4886]: E0129 17:08:45.271394 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c5f721-30d1-48de-97e4-52399587c9d1" containerName="extract-utilities" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271402 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c5f721-30d1-48de-97e4-52399587c9d1" containerName="extract-utilities" Jan 29 17:08:45 crc kubenswrapper[4886]: E0129 17:08:45.271418 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c5f721-30d1-48de-97e4-52399587c9d1" containerName="extract-content" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271426 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c5f721-30d1-48de-97e4-52399587c9d1" containerName="extract-content" Jan 29 17:08:45 crc kubenswrapper[4886]: E0129 17:08:45.271439 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="sg-core" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271446 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="sg-core" Jan 29 17:08:45 crc kubenswrapper[4886]: E0129 17:08:45.271467 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="ceilometer-notification-agent" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271475 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="ceilometer-notification-agent" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271831 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="ceilometer-central-agent" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271860 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6ce536-47ec-45b9-b926-28f1fa7eb80a" containerName="heat-api" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271877 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c5f721-30d1-48de-97e4-52399587c9d1" containerName="registry-server" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271889 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="ceilometer-notification-agent" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271904 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="sg-core" Jan 29 17:08:45 crc 
kubenswrapper[4886]: I0129 17:08:45.271927 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" containerName="heat-engine" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.271948 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="291e8ff3-6792-4900-86a1-df3730548041" containerName="proxy-httpd" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.274593 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.277706 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.281542 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.313376 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.373065 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-config-data\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.373129 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.373275 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.373306 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrz2g\" (UniqueName: \"kubernetes.io/projected/dc82dcdd-793c-4083-9143-1b04037f40d3-kube-api-access-wrz2g\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.373353 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-run-httpd\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.373381 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-scripts\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.373426 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-log-httpd\") pod \"ceilometer-0\" 
(UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.475437 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-log-httpd\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.475713 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-config-data\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.475784 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.475909 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-log-httpd\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.475956 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.476010 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrz2g\" (UniqueName: \"kubernetes.io/projected/dc82dcdd-793c-4083-9143-1b04037f40d3-kube-api-access-wrz2g\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.476074 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-run-httpd\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.476105 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-scripts\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.477150 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-run-httpd\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.480593 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-scripts\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " 
pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.480666 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.480678 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-config-data\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.481850 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.499136 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrz2g\" (UniqueName: \"kubernetes.io/projected/dc82dcdd-793c-4083-9143-1b04037f40d3-kube-api-access-wrz2g\") pod \"ceilometer-0\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " pod="openstack/ceilometer-0" Jan 29 17:08:45 crc kubenswrapper[4886]: I0129 17:08:45.699409 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:08:46 crc kubenswrapper[4886]: I0129 17:08:46.185890 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:46 crc kubenswrapper[4886]: W0129 17:08:46.193667 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc82dcdd_793c_4083_9143_1b04037f40d3.slice/crio-17d5fd5f42ae0736004ca73847456c411bd6a9d8d5a5c3344ecb73c5ac5a2736 WatchSource:0}: Error finding container 17d5fd5f42ae0736004ca73847456c411bd6a9d8d5a5c3344ecb73c5ac5a2736: Status 404 returned error can't find the container with id 17d5fd5f42ae0736004ca73847456c411bd6a9d8d5a5c3344ecb73c5ac5a2736 Jan 29 17:08:46 crc kubenswrapper[4886]: I0129 17:08:46.629123 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c5f721-30d1-48de-97e4-52399587c9d1" path="/var/lib/kubelet/pods/18c5f721-30d1-48de-97e4-52399587c9d1/volumes" Jan 29 17:08:46 crc kubenswrapper[4886]: I0129 17:08:46.630350 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="291e8ff3-6792-4900-86a1-df3730548041" path="/var/lib/kubelet/pods/291e8ff3-6792-4900-86a1-df3730548041/volumes" Jan 29 17:08:46 crc kubenswrapper[4886]: I0129 17:08:46.631520 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f" path="/var/lib/kubelet/pods/92e92176-b984-4dd5-8ea0-8bcb3dbe5e2f/volumes" Jan 29 17:08:47 crc kubenswrapper[4886]: I0129 17:08:47.062353 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc82dcdd-793c-4083-9143-1b04037f40d3","Type":"ContainerStarted","Data":"17d5fd5f42ae0736004ca73847456c411bd6a9d8d5a5c3344ecb73c5ac5a2736"} Jan 29 17:08:49 crc kubenswrapper[4886]: I0129 17:08:49.085909 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dc82dcdd-793c-4083-9143-1b04037f40d3","Type":"ContainerStarted","Data":"027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97"} Jan 29 17:08:49 crc kubenswrapper[4886]: I0129 17:08:49.086534 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc82dcdd-793c-4083-9143-1b04037f40d3","Type":"ContainerStarted","Data":"8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20"} Jan 29 17:08:50 crc kubenswrapper[4886]: I0129 17:08:50.105199 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc82dcdd-793c-4083-9143-1b04037f40d3","Type":"ContainerStarted","Data":"6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8"} Jan 29 17:08:54 crc kubenswrapper[4886]: I0129 17:08:54.160549 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc82dcdd-793c-4083-9143-1b04037f40d3","Type":"ContainerStarted","Data":"b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4"} Jan 29 17:08:54 crc kubenswrapper[4886]: I0129 17:08:54.161241 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 17:08:54 crc kubenswrapper[4886]: I0129 17:08:54.190773 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.36065191 podStartE2EDuration="9.190754165s" podCreationTimestamp="2026-01-29 17:08:45 +0000 UTC" firstStartedPulling="2026-01-29 17:08:46.197297995 +0000 UTC m=+2809.106017267" lastFinishedPulling="2026-01-29 17:08:53.02740023 +0000 UTC m=+2815.936119522" observedRunningTime="2026-01-29 17:08:54.178023057 +0000 UTC m=+2817.086742329" watchObservedRunningTime="2026-01-29 17:08:54.190754165 +0000 UTC m=+2817.099473437" Jan 29 17:08:57 crc kubenswrapper[4886]: I0129 17:08:57.716428 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:08:57 crc kubenswrapper[4886]: I0129 17:08:57.717708 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="ceilometer-central-agent" containerID="cri-o://8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20" gracePeriod=30 Jan 29 17:08:57 crc kubenswrapper[4886]: I0129 17:08:57.717731 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="sg-core" containerID="cri-o://6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8" gracePeriod=30 Jan 29 17:08:57 crc kubenswrapper[4886]: I0129 17:08:57.717854 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="ceilometer-notification-agent" containerID="cri-o://027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97" gracePeriod=30 Jan 29 17:08:57 crc kubenswrapper[4886]: I0129 17:08:57.717867 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="proxy-httpd" containerID="cri-o://b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4" gracePeriod=30 Jan 29 17:08:58 crc kubenswrapper[4886]: I0129 17:08:58.201289 4886 generic.go:334] "Generic (PLEG): container finished" podID="dc82dcdd-793c-4083-9143-1b04037f40d3" 
containerID="b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4" exitCode=0 Jan 29 17:08:58 crc kubenswrapper[4886]: I0129 17:08:58.201671 4886 generic.go:334] "Generic (PLEG): container finished" podID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerID="6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8" exitCode=2 Jan 29 17:08:58 crc kubenswrapper[4886]: I0129 17:08:58.201360 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc82dcdd-793c-4083-9143-1b04037f40d3","Type":"ContainerDied","Data":"b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4"} Jan 29 17:08:58 crc kubenswrapper[4886]: I0129 17:08:58.201707 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc82dcdd-793c-4083-9143-1b04037f40d3","Type":"ContainerDied","Data":"6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8"} Jan 29 17:08:59 crc kubenswrapper[4886]: I0129 17:08:59.218159 4886 generic.go:334] "Generic (PLEG): container finished" podID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerID="027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97" exitCode=0 Jan 29 17:08:59 crc kubenswrapper[4886]: I0129 17:08:59.218233 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc82dcdd-793c-4083-9143-1b04037f40d3","Type":"ContainerDied","Data":"027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97"} Jan 29 17:08:59 crc kubenswrapper[4886]: I0129 17:08:59.220090 4886 generic.go:334] "Generic (PLEG): container finished" podID="c467eb7e-a553-4fc5-b366-607a30fe18dd" containerID="b316bbc4bed9ea6d21a1f48ac1daf91a604e958e8664a1c95a0d70b2476abcfa" exitCode=0 Jan 29 17:08:59 crc kubenswrapper[4886]: I0129 17:08:59.220131 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c4q4z" event={"ID":"c467eb7e-a553-4fc5-b366-607a30fe18dd","Type":"ContainerDied","Data":"b316bbc4bed9ea6d21a1f48ac1daf91a604e958e8664a1c95a0d70b2476abcfa"} Jan 29 17:08:59 crc kubenswrapper[4886]: I0129 17:08:59.661296 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:08:59 crc kubenswrapper[4886]: I0129 17:08:59.661384 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:08:59 crc kubenswrapper[4886]: I0129 17:08:59.661437 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 17:08:59 crc kubenswrapper[4886]: I0129 17:08:59.662467 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db3893b2fd9096a13f5744612d4a2bcbba80c7ed2ddb6ffa1307348c351b1963"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 17:08:59 crc kubenswrapper[4886]: I0129 17:08:59.662536 4886 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://db3893b2fd9096a13f5744612d4a2bcbba80c7ed2ddb6ffa1307348c351b1963" gracePeriod=600 Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.234637 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="db3893b2fd9096a13f5744612d4a2bcbba80c7ed2ddb6ffa1307348c351b1963" exitCode=0 Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.234731 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"db3893b2fd9096a13f5744612d4a2bcbba80c7ed2ddb6ffa1307348c351b1963"} Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.235023 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d"} Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.235048 4886 scope.go:117] "RemoveContainer" containerID="1ef597c576c05004c5148470ade7ddd51ab3cad8d942f918ff09afb054559dfc" Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.675381 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.862761 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-scripts\") pod \"c467eb7e-a553-4fc5-b366-607a30fe18dd\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.862868 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97hdc\" (UniqueName: \"kubernetes.io/projected/c467eb7e-a553-4fc5-b366-607a30fe18dd-kube-api-access-97hdc\") pod \"c467eb7e-a553-4fc5-b366-607a30fe18dd\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.862955 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-combined-ca-bundle\") pod \"c467eb7e-a553-4fc5-b366-607a30fe18dd\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.863039 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-config-data\") pod \"c467eb7e-a553-4fc5-b366-607a30fe18dd\" (UID: \"c467eb7e-a553-4fc5-b366-607a30fe18dd\") " Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.870945 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-scripts" (OuterVolumeSpecName: "scripts") pod "c467eb7e-a553-4fc5-b366-607a30fe18dd" (UID: "c467eb7e-a553-4fc5-b366-607a30fe18dd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.871135 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c467eb7e-a553-4fc5-b366-607a30fe18dd-kube-api-access-97hdc" (OuterVolumeSpecName: "kube-api-access-97hdc") pod "c467eb7e-a553-4fc5-b366-607a30fe18dd" (UID: "c467eb7e-a553-4fc5-b366-607a30fe18dd"). InnerVolumeSpecName "kube-api-access-97hdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.896470 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c467eb7e-a553-4fc5-b366-607a30fe18dd" (UID: "c467eb7e-a553-4fc5-b366-607a30fe18dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.896537 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-config-data" (OuterVolumeSpecName: "config-data") pod "c467eb7e-a553-4fc5-b366-607a30fe18dd" (UID: "c467eb7e-a553-4fc5-b366-607a30fe18dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.966918 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.967209 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97hdc\" (UniqueName: \"kubernetes.io/projected/c467eb7e-a553-4fc5-b366-607a30fe18dd-kube-api-access-97hdc\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.967393 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:00 crc kubenswrapper[4886]: I0129 17:09:00.967522 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c467eb7e-a553-4fc5-b366-607a30fe18dd-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.252767 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c4q4z" event={"ID":"c467eb7e-a553-4fc5-b366-607a30fe18dd","Type":"ContainerDied","Data":"e030969deba149d036416125fae7ad0b0c1ce2a5efabff4aeea1c2936fb7a1ec"} Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.253101 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e030969deba149d036416125fae7ad0b0c1ce2a5efabff4aeea1c2936fb7a1ec" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.253007 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c4q4z" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.360145 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 17:09:01 crc kubenswrapper[4886]: E0129 17:09:01.360624 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c467eb7e-a553-4fc5-b366-607a30fe18dd" containerName="nova-cell0-conductor-db-sync" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.360642 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c467eb7e-a553-4fc5-b366-607a30fe18dd" containerName="nova-cell0-conductor-db-sync" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.360977 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c467eb7e-a553-4fc5-b366-607a30fe18dd" containerName="nova-cell0-conductor-db-sync" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.361792 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.363828 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wcdz5" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.364535 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.376308 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc7sn\" (UniqueName: \"kubernetes.io/projected/bb22403c-016a-48ea-954a-b7b14ea77d7f-kube-api-access-bc7sn\") pod \"nova-cell0-conductor-0\" (UID: \"bb22403c-016a-48ea-954a-b7b14ea77d7f\") " pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.376365 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb22403c-016a-48ea-954a-b7b14ea77d7f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb22403c-016a-48ea-954a-b7b14ea77d7f\") " pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.376472 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb22403c-016a-48ea-954a-b7b14ea77d7f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb22403c-016a-48ea-954a-b7b14ea77d7f\") " pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.382254 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.479586 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc7sn\" (UniqueName: \"kubernetes.io/projected/bb22403c-016a-48ea-954a-b7b14ea77d7f-kube-api-access-bc7sn\") pod \"nova-cell0-conductor-0\" (UID: \"bb22403c-016a-48ea-954a-b7b14ea77d7f\") " pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.479647 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb22403c-016a-48ea-954a-b7b14ea77d7f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb22403c-016a-48ea-954a-b7b14ea77d7f\") " pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:01 crc kubenswrapper[4886]: 
I0129 17:09:01.479775 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb22403c-016a-48ea-954a-b7b14ea77d7f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb22403c-016a-48ea-954a-b7b14ea77d7f\") " pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.492007 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb22403c-016a-48ea-954a-b7b14ea77d7f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb22403c-016a-48ea-954a-b7b14ea77d7f\") " pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.492399 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb22403c-016a-48ea-954a-b7b14ea77d7f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb22403c-016a-48ea-954a-b7b14ea77d7f\") " pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.501520 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc7sn\" (UniqueName: \"kubernetes.io/projected/bb22403c-016a-48ea-954a-b7b14ea77d7f-kube-api-access-bc7sn\") pod \"nova-cell0-conductor-0\" (UID: \"bb22403c-016a-48ea-954a-b7b14ea77d7f\") " pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:01 crc kubenswrapper[4886]: I0129 17:09:01.685825 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:02 crc kubenswrapper[4886]: I0129 17:09:02.170951 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 17:09:02 crc kubenswrapper[4886]: I0129 17:09:02.292808 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb22403c-016a-48ea-954a-b7b14ea77d7f","Type":"ContainerStarted","Data":"9b6695254391d39cd72ea747b0a5494ab7ebb80ca161b9598778aa51d461fb31"} Jan 29 17:09:02 crc kubenswrapper[4886]: I0129 17:09:02.868057 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.041950 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-combined-ca-bundle\") pod \"dc82dcdd-793c-4083-9143-1b04037f40d3\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.042045 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-scripts\") pod \"dc82dcdd-793c-4083-9143-1b04037f40d3\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.042199 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrz2g\" (UniqueName: \"kubernetes.io/projected/dc82dcdd-793c-4083-9143-1b04037f40d3-kube-api-access-wrz2g\") pod \"dc82dcdd-793c-4083-9143-1b04037f40d3\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.042245 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-log-httpd\") pod \"dc82dcdd-793c-4083-9143-1b04037f40d3\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.042279 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-config-data\") pod \"dc82dcdd-793c-4083-9143-1b04037f40d3\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.042386 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-sg-core-conf-yaml\") pod \"dc82dcdd-793c-4083-9143-1b04037f40d3\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.042431 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-run-httpd\") pod \"dc82dcdd-793c-4083-9143-1b04037f40d3\" (UID: \"dc82dcdd-793c-4083-9143-1b04037f40d3\") " Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.044118 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc82dcdd-793c-4083-9143-1b04037f40d3" (UID: "dc82dcdd-793c-4083-9143-1b04037f40d3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.044482 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc82dcdd-793c-4083-9143-1b04037f40d3" (UID: "dc82dcdd-793c-4083-9143-1b04037f40d3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.048801 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-scripts" (OuterVolumeSpecName: "scripts") pod "dc82dcdd-793c-4083-9143-1b04037f40d3" (UID: "dc82dcdd-793c-4083-9143-1b04037f40d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.049854 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc82dcdd-793c-4083-9143-1b04037f40d3-kube-api-access-wrz2g" (OuterVolumeSpecName: "kube-api-access-wrz2g") pod "dc82dcdd-793c-4083-9143-1b04037f40d3" (UID: "dc82dcdd-793c-4083-9143-1b04037f40d3"). InnerVolumeSpecName "kube-api-access-wrz2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.106962 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc82dcdd-793c-4083-9143-1b04037f40d3" (UID: "dc82dcdd-793c-4083-9143-1b04037f40d3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.146361 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.146402 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.146414 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.146448 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrz2g\" (UniqueName: \"kubernetes.io/projected/dc82dcdd-793c-4083-9143-1b04037f40d3-kube-api-access-wrz2g\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.146461 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc82dcdd-793c-4083-9143-1b04037f40d3-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.185960 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc82dcdd-793c-4083-9143-1b04037f40d3" (UID: "dc82dcdd-793c-4083-9143-1b04037f40d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.212209 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-config-data" (OuterVolumeSpecName: "config-data") pod "dc82dcdd-793c-4083-9143-1b04037f40d3" (UID: "dc82dcdd-793c-4083-9143-1b04037f40d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.249182 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.249226 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc82dcdd-793c-4083-9143-1b04037f40d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.313460 4886 generic.go:334] "Generic (PLEG): container finished" podID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerID="8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20" exitCode=0 Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.313562 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc82dcdd-793c-4083-9143-1b04037f40d3","Type":"ContainerDied","Data":"8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20"} Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.314884 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc82dcdd-793c-4083-9143-1b04037f40d3","Type":"ContainerDied","Data":"17d5fd5f42ae0736004ca73847456c411bd6a9d8d5a5c3344ecb73c5ac5a2736"} Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.314974 4886 scope.go:117] "RemoveContainer" containerID="b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.313603 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.328449 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb22403c-016a-48ea-954a-b7b14ea77d7f","Type":"ContainerStarted","Data":"c465162f1ad3d58d9d3acd7ece43f775baddecdcd0956b5a30e3866d2383acf1"} Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.330046 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.369588 4886 scope.go:117] "RemoveContainer" containerID="6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.394445 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.394419173 podStartE2EDuration="2.394419173s" podCreationTimestamp="2026-01-29 17:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:09:03.390884034 +0000 UTC m=+2826.299603306" watchObservedRunningTime="2026-01-29 17:09:03.394419173 +0000 UTC m=+2826.303138445" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.426657 4886 scope.go:117] "RemoveContainer" containerID="027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.452578 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.476788 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.504425 4886 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:03 crc kubenswrapper[4886]: E0129 17:09:03.505112 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="ceilometer-central-agent" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.505141 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="ceilometer-central-agent" Jan 29 17:09:03 crc kubenswrapper[4886]: E0129 17:09:03.505163 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="proxy-httpd" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.505172 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="proxy-httpd" Jan 29 17:09:03 crc kubenswrapper[4886]: E0129 17:09:03.505200 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="ceilometer-notification-agent" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.505209 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="ceilometer-notification-agent" Jan 29 17:09:03 crc kubenswrapper[4886]: E0129 17:09:03.505255 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="sg-core" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.505264 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="sg-core" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.505572 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="ceilometer-notification-agent" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.505607 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="sg-core" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.505641 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="proxy-httpd" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.505659 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" containerName="ceilometer-central-agent" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.523962 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.524123 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.530853 4886 scope.go:117] "RemoveContainer" containerID="8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.531094 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.531177 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.587542 4886 scope.go:117] "RemoveContainer" containerID="b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4" Jan 29 17:09:03 crc kubenswrapper[4886]: E0129 17:09:03.588797 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4\": container with ID starting with b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4 not found: ID does not exist" containerID="b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.588854 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4"} err="failed to get container status \"b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4\": rpc error: code = NotFound desc = could not find container \"b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4\": container with ID starting with b4bbd9c439d2c24659fb57b3faf885aaff4aa720b408e45a5289e66ac74560d4 not found: ID does not exist" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.588889 4886 scope.go:117] "RemoveContainer" containerID="6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8" Jan 29 17:09:03 crc kubenswrapper[4886]: E0129 17:09:03.589881 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8\": container with ID starting with 6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8 not found: ID does not exist" containerID="6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.589926 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8"} err="failed to get container status \"6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8\": rpc error: code = NotFound desc = could not find container \"6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8\": container with ID starting with 6549cee8bc993f3edbbbdedce8da615b537aaf75fc4fbbffe8a146e13427c8c8 not found: ID does not exist" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.589992 4886 scope.go:117] "RemoveContainer" containerID="027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97" Jan 29 17:09:03 crc kubenswrapper[4886]: E0129 17:09:03.590459 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97\": container with ID starting with 
027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97 not found: ID does not exist" containerID="027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.590490 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97"} err="failed to get container status \"027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97\": rpc error: code = NotFound desc = could not find container \"027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97\": container with ID starting with 027f2f6b9a90551af8155e3f9d55caa5b15fe881b17a34fbffe2e1da19cdee97 not found: ID does not exist" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.590510 4886 scope.go:117] "RemoveContainer" containerID="8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20" Jan 29 17:09:03 crc kubenswrapper[4886]: E0129 17:09:03.591680 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20\": container with ID starting with 8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20 not found: ID does not exist" containerID="8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.591731 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20"} err="failed to get container status \"8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20\": rpc error: code = NotFound desc = could not find container \"8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20\": container with ID starting with 8637ee0b12535652fad4c6c24b400526b4e4e5a64b9711598c8207164cbe4a20 not found: ID does not exist" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.670563 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-run-httpd\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.670633 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-config-data\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.670723 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-scripts\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.670817 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.670917 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.671067 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-log-httpd\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.671120 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk52z\" (UniqueName: \"kubernetes.io/projected/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-kube-api-access-fk52z\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.772849 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-run-httpd\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.772908 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-config-data\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.772967 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-scripts\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.773002 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.773030 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.773105 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-log-httpd\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.773126 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk52z\" (UniqueName: \"kubernetes.io/projected/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-kube-api-access-fk52z\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 
17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.773370 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-run-httpd\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.773741 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-log-httpd\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.778018 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.778168 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-scripts\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.781710 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.782903 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-config-data\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.790406 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk52z\" (UniqueName: \"kubernetes.io/projected/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-kube-api-access-fk52z\") pod \"ceilometer-0\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " pod="openstack/ceilometer-0" Jan 29 17:09:03 crc kubenswrapper[4886]: I0129 17:09:03.860388 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:04 crc kubenswrapper[4886]: I0129 17:09:04.410819 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:04 crc kubenswrapper[4886]: I0129 17:09:04.630189 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc82dcdd-793c-4083-9143-1b04037f40d3" path="/var/lib/kubelet/pods/dc82dcdd-793c-4083-9143-1b04037f40d3/volumes" Jan 29 17:09:05 crc kubenswrapper[4886]: I0129 17:09:05.354699 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c","Type":"ContainerStarted","Data":"f28b9a9b2e33861b2b8937e8a0acf07992031f2291a0da6c8fc53223704d8f50"} Jan 29 17:09:05 crc kubenswrapper[4886]: I0129 17:09:05.355287 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c","Type":"ContainerStarted","Data":"ee2c96cf4752f271ab59c1e5d9ef8010edcb2061ecccd36a34d602bf9c8f1068"} Jan 29 17:09:06 crc kubenswrapper[4886]: I0129 17:09:06.368644 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c","Type":"ContainerStarted","Data":"8ed383dcd150e84a715deaf0b080e1c2f8bb3800fd02ff47edc2c3516be536cf"} Jan 29 17:09:07 crc kubenswrapper[4886]: I0129 17:09:07.384475 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c","Type":"ContainerStarted","Data":"1d04206c0d41b909492932943b574fcef26ed1b2dfcf90d669a67515dcaabab7"} Jan 29 17:09:09 crc kubenswrapper[4886]: I0129 17:09:09.414138 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c","Type":"ContainerStarted","Data":"b8916a65aaeb4f4e843c5fba061a08311e52e99052d791e323ff6941a73b7589"} Jan 29 17:09:09 crc kubenswrapper[4886]: I0129 17:09:09.415189 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 17:09:09 crc kubenswrapper[4886]: I0129 17:09:09.450730 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.200388919 podStartE2EDuration="6.450711102s" podCreationTimestamp="2026-01-29 17:09:03 +0000 UTC" firstStartedPulling="2026-01-29 17:09:04.400739776 +0000 UTC m=+2827.309459048" lastFinishedPulling="2026-01-29 17:09:08.651061959 +0000 UTC m=+2831.559781231" observedRunningTime="2026-01-29 17:09:09.450047733 +0000 UTC m=+2832.358767055" watchObservedRunningTime="2026-01-29 17:09:09.450711102 +0000 UTC m=+2832.359430584" Jan 29 17:09:11 crc kubenswrapper[4886]: I0129 17:09:11.728046 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.514196 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-tqcf4"] Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.517806 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.522859 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.523098 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.535096 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tqcf4"] Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.599050 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-config-data\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.599105 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-scripts\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.599150 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhvmq\" (UniqueName: \"kubernetes.io/projected/8cabf586-398a-45a9-80d6-2fd63d9e14e5-kube-api-access-vhvmq\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.599379 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.664484 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.666318 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.683852 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.685400 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.701159 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.701276 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-config-data\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.701302 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-scripts\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.701368 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhvmq\" (UniqueName: \"kubernetes.io/projected/8cabf586-398a-45a9-80d6-2fd63d9e14e5-kube-api-access-vhvmq\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.716267 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.718303 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.722035 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.730833 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-scripts\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.735529 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.742086 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.768690 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.770185 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.783985 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.800903 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-config-data\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.804389 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.804467 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63670887-1250-42df-a728-315414be9901-logs\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.804492 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.804783 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frqq5\" (UniqueName: \"kubernetes.io/projected/63670887-1250-42df-a728-315414be9901-kube-api-access-frqq5\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc 
kubenswrapper[4886]: I0129 17:09:12.804863 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-config-data\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.804932 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-config-data\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.804952 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sdhg\" (UniqueName: \"kubernetes.io/projected/c24e1f4d-2c34-4496-bd90-4fe840552491-kube-api-access-5sdhg\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.804968 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c24e1f4d-2c34-4496-bd90-4fe840552491-logs\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.810438 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhvmq\" (UniqueName: \"kubernetes.io/projected/8cabf586-398a-45a9-80d6-2fd63d9e14e5-kube-api-access-vhvmq\") pod \"nova-cell0-cell-mapping-tqcf4\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.821249 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.843221 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.908173 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-config-data\") pod \"nova-scheduler-0\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.908246 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.908265 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dblx2\" (UniqueName: \"kubernetes.io/projected/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-kube-api-access-dblx2\") pod \"nova-scheduler-0\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.908307 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63670887-1250-42df-a728-315414be9901-logs\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.908342 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.908384 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frqq5\" (UniqueName: \"kubernetes.io/projected/63670887-1250-42df-a728-315414be9901-kube-api-access-frqq5\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.908420 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-config-data\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.908458 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-config-data\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.908476 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sdhg\" (UniqueName: \"kubernetes.io/projected/c24e1f4d-2c34-4496-bd90-4fe840552491-kube-api-access-5sdhg\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.908492 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c24e1f4d-2c34-4496-bd90-4fe840552491-logs\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.908511 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.909945 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63670887-1250-42df-a728-315414be9901-logs\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.910763 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c24e1f4d-2c34-4496-bd90-4fe840552491-logs\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.937148 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.944893 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-config-data\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.944966 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-config-data\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.957083 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sdhg\" (UniqueName: \"kubernetes.io/projected/c24e1f4d-2c34-4496-bd90-4fe840552491-kube-api-access-5sdhg\") pod \"nova-api-0\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " pod="openstack/nova-api-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.959972 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zdbgk"] Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.961796 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.968077 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frqq5\" (UniqueName: \"kubernetes.io/projected/63670887-1250-42df-a728-315414be9901-kube-api-access-frqq5\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:12 crc kubenswrapper[4886]: I0129 17:09:12.968202 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63670887-1250-42df-a728-315414be9901\") " pod="openstack/nova-metadata-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.002369 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zdbgk"] Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.011282 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dblx2\" (UniqueName: \"kubernetes.io/projected/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-kube-api-access-dblx2\") pod \"nova-scheduler-0\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.011505 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.011558 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-config-data\") pod \"nova-scheduler-0\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.036260 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.038184 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-config-data\") pod \"nova-scheduler-0\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.046432 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dblx2\" (UniqueName: \"kubernetes.io/projected/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-kube-api-access-dblx2\") pod \"nova-scheduler-0\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.049392 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.051304 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.052416 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.083519 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.092938 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.107673 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.113363 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9csz\" (UniqueName: \"kubernetes.io/projected/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-kube-api-access-x9csz\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.113427 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-svc\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.113492 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-config\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.113530 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.113553 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.113592 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.122780 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.216213 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9csz\" (UniqueName: \"kubernetes.io/projected/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-kube-api-access-x9csz\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.216721 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-svc\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.216880 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.216913 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.217040 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-config\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.217137 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.217194 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.217263 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.217336 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmffz\" (UniqueName: \"kubernetes.io/projected/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-kube-api-access-rmffz\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.217777 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-svc\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.219195 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.220008 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.220076 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-config\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.220391 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.268814 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9csz\" (UniqueName: \"kubernetes.io/projected/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-kube-api-access-x9csz\") pod \"dnsmasq-dns-9b86998b5-zdbgk\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.319820 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmffz\" (UniqueName: \"kubernetes.io/projected/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-kube-api-access-rmffz\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.320013 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.320040 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.326762 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.332192 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.344520 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmffz\" (UniqueName: \"kubernetes.io/projected/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-kube-api-access-rmffz\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.434303 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.442614 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.822398 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tqcf4"] Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.913565 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:09:13 crc kubenswrapper[4886]: I0129 17:09:13.957151 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:09:14 crc kubenswrapper[4886]: I0129 17:09:14.112855 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:09:14 crc kubenswrapper[4886]: I0129 17:09:14.344340 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 17:09:14 crc kubenswrapper[4886]: W0129 17:09:14.346512 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ccf7a7a_f65b_4942_9bfa_bc7a377e6ff1.slice/crio-f636861581833a86368762de32a4ca62df7734738d06a2800f3b6b0ee4fb4aa1 WatchSource:0}: Error finding container f636861581833a86368762de32a4ca62df7734738d06a2800f3b6b0ee4fb4aa1: Status 404 returned error can't find the container with id f636861581833a86368762de32a4ca62df7734738d06a2800f3b6b0ee4fb4aa1 Jan 29 17:09:14 crc kubenswrapper[4886]: I0129 17:09:14.367794 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zdbgk"] Jan 29 17:09:14 crc kubenswrapper[4886]: I0129 17:09:14.494937 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c24e1f4d-2c34-4496-bd90-4fe840552491","Type":"ContainerStarted","Data":"eb8a3baac4fbd0a80179f8a19f3f61fb9fca2e4d5dcfe096915c43ef69238e98"} Jan 29 17:09:14 crc kubenswrapper[4886]: I0129 17:09:14.520895 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tqcf4" event={"ID":"8cabf586-398a-45a9-80d6-2fd63d9e14e5","Type":"ContainerStarted","Data":"d6960d602147a760f370e0aaeba322f8c53999b050075e5ef6c33ecafc0b7928"} Jan 29 17:09:14 crc kubenswrapper[4886]: I0129 17:09:14.520936 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tqcf4" event={"ID":"8cabf586-398a-45a9-80d6-2fd63d9e14e5","Type":"ContainerStarted","Data":"c9ea59738c6ba35a7c3d3e2f05ce7750bd7b76ba456616dc38cec147840a905e"} Jan 29 17:09:14 crc kubenswrapper[4886]: I0129 17:09:14.528240 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63670887-1250-42df-a728-315414be9901","Type":"ContainerStarted","Data":"54233804a9ed5dc337d2e33b8c617c4a33e85a8e6af923aaf251e6cf9186b374"} Jan 29 17:09:14 crc kubenswrapper[4886]: I0129 17:09:14.530551 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3441bcd4-bf8b-406f-b3f5-1c723908bdc4","Type":"ContainerStarted","Data":"95891069401cb7e43c836c472c728a63f5e1133c6a2287df2be68780c76d5016"} Jan 29 17:09:14 crc kubenswrapper[4886]: I0129 17:09:14.541527 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" event={"ID":"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1","Type":"ContainerStarted","Data":"f636861581833a86368762de32a4ca62df7734738d06a2800f3b6b0ee4fb4aa1"} Jan 29 17:09:14 crc kubenswrapper[4886]: I0129 17:09:14.543719 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11","Type":"ContainerStarted","Data":"b9417b27c0621c2b043b290e7d29fbfb8ed923b29824c45f4941d5924a3fcf00"} Jan 29 17:09:14 crc kubenswrapper[4886]: I0129 17:09:14.554702 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-tqcf4" podStartSLOduration=2.554683149 podStartE2EDuration="2.554683149s" podCreationTimestamp="2026-01-29 17:09:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:09:14.546492658 +0000 UTC m=+2837.455211940" watchObservedRunningTime="2026-01-29 17:09:14.554683149 +0000 UTC m=+2837.463402421" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.107206 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fznz7"] Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.111738 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.115155 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.115621 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.124135 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fznz7"] Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.172707 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n78gf\" (UniqueName: \"kubernetes.io/projected/a88a08b7-d54a-4414-b7f6-b490949d6b70-kube-api-access-n78gf\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.173107 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-scripts\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.173333 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-config-data\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.173509 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.275669 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-config-data\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.275814 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.275860 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n78gf\" (UniqueName: \"kubernetes.io/projected/a88a08b7-d54a-4414-b7f6-b490949d6b70-kube-api-access-n78gf\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.276018 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-scripts\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.283102 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.284480 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-config-data\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.286017 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-scripts\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.336119 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n78gf\" (UniqueName: \"kubernetes.io/projected/a88a08b7-d54a-4414-b7f6-b490949d6b70-kube-api-access-n78gf\") pod \"nova-cell1-conductor-db-sync-fznz7\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.440836 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.558177 4886 generic.go:334] "Generic (PLEG): container finished" podID="8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" containerID="8bfd8a8fe8f520c0bdd3a5164fe133a10f3e76f19d1c34103c42b1d9ab4fdfeb" exitCode=0 Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.560436 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" event={"ID":"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1","Type":"ContainerDied","Data":"8bfd8a8fe8f520c0bdd3a5164fe133a10f3e76f19d1c34103c42b1d9ab4fdfeb"} Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.703402 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.703980 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="ceilometer-central-agent" containerID="cri-o://f28b9a9b2e33861b2b8937e8a0acf07992031f2291a0da6c8fc53223704d8f50" gracePeriod=30 Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.704607 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="proxy-httpd" containerID="cri-o://b8916a65aaeb4f4e843c5fba061a08311e52e99052d791e323ff6941a73b7589" gracePeriod=30 Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.704670 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="sg-core" containerID="cri-o://1d04206c0d41b909492932943b574fcef26ed1b2dfcf90d669a67515dcaabab7" gracePeriod=30 Jan 29 17:09:15 crc kubenswrapper[4886]: I0129 17:09:15.704749 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="ceilometer-notification-agent" containerID="cri-o://8ed383dcd150e84a715deaf0b080e1c2f8bb3800fd02ff47edc2c3516be536cf" gracePeriod=30 Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.108565 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fznz7"] Jan 29 17:09:16 crc kubenswrapper[4886]: W0129 17:09:16.115946 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda88a08b7_d54a_4414_b7f6_b490949d6b70.slice/crio-0f300c9b5b26753aaff19219c045a650f2a2a1dbd8aa16dd9736b14b2cbcde2c WatchSource:0}: Error finding container 0f300c9b5b26753aaff19219c045a650f2a2a1dbd8aa16dd9736b14b2cbcde2c: Status 404 returned error can't find the container with id 0f300c9b5b26753aaff19219c045a650f2a2a1dbd8aa16dd9736b14b2cbcde2c Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.421731 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.442125 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.579832 4886 generic.go:334] "Generic (PLEG): container finished" podID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerID="b8916a65aaeb4f4e843c5fba061a08311e52e99052d791e323ff6941a73b7589" exitCode=0 Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.579889 4886 generic.go:334] 
"Generic (PLEG): container finished" podID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerID="1d04206c0d41b909492932943b574fcef26ed1b2dfcf90d669a67515dcaabab7" exitCode=2 Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.579903 4886 generic.go:334] "Generic (PLEG): container finished" podID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerID="8ed383dcd150e84a715deaf0b080e1c2f8bb3800fd02ff47edc2c3516be536cf" exitCode=0 Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.579972 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c","Type":"ContainerDied","Data":"b8916a65aaeb4f4e843c5fba061a08311e52e99052d791e323ff6941a73b7589"} Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.580005 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c","Type":"ContainerDied","Data":"1d04206c0d41b909492932943b574fcef26ed1b2dfcf90d669a67515dcaabab7"} Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.580020 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c","Type":"ContainerDied","Data":"8ed383dcd150e84a715deaf0b080e1c2f8bb3800fd02ff47edc2c3516be536cf"} Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.581819 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" event={"ID":"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1","Type":"ContainerStarted","Data":"18dccc69ea12ffd53b4d4c8e312d9e5ee415348aafbce21b941019b15077a6b6"} Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.581947 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.583324 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fznz7" event={"ID":"a88a08b7-d54a-4414-b7f6-b490949d6b70","Type":"ContainerStarted","Data":"b0c7be4a8a6f220b0bc62ecd7ce7d07cb8b17e5644962c70a9a466af1717c6ce"} Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.583475 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fznz7" event={"ID":"a88a08b7-d54a-4414-b7f6-b490949d6b70","Type":"ContainerStarted","Data":"0f300c9b5b26753aaff19219c045a650f2a2a1dbd8aa16dd9736b14b2cbcde2c"} Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.608222 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" podStartSLOduration=4.6082033970000005 podStartE2EDuration="4.608203397s" podCreationTimestamp="2026-01-29 17:09:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:09:16.603805133 +0000 UTC m=+2839.512524405" watchObservedRunningTime="2026-01-29 17:09:16.608203397 +0000 UTC m=+2839.516922669" Jan 29 17:09:16 crc kubenswrapper[4886]: I0129 17:09:16.651770 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-fznz7" podStartSLOduration=1.651750424 podStartE2EDuration="1.651750424s" podCreationTimestamp="2026-01-29 17:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:09:16.635740903 +0000 UTC m=+2839.544460175" watchObservedRunningTime="2026-01-29 
17:09:16.651750424 +0000 UTC m=+2839.560469696" Jan 29 17:09:19 crc kubenswrapper[4886]: I0129 17:09:19.680518 4886 generic.go:334] "Generic (PLEG): container finished" podID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerID="f28b9a9b2e33861b2b8937e8a0acf07992031f2291a0da6c8fc53223704d8f50" exitCode=0 Jan 29 17:09:19 crc kubenswrapper[4886]: I0129 17:09:19.681111 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c","Type":"ContainerDied","Data":"f28b9a9b2e33861b2b8937e8a0acf07992031f2291a0da6c8fc53223704d8f50"} Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.010180 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.045110 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-sg-core-conf-yaml\") pod \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.045236 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk52z\" (UniqueName: \"kubernetes.io/projected/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-kube-api-access-fk52z\") pod \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.045268 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-run-httpd\") pod \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.045287 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-config-data\") pod \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.045359 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-combined-ca-bundle\") pod \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.045464 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-log-httpd\") pod \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.045510 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-scripts\") pod \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\" (UID: \"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c\") " Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.047659 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" (UID: 
"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.052015 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" (UID: "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.078241 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-kube-api-access-fk52z" (OuterVolumeSpecName: "kube-api-access-fk52z") pod "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" (UID: "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c"). InnerVolumeSpecName "kube-api-access-fk52z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.157299 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk52z\" (UniqueName: \"kubernetes.io/projected/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-kube-api-access-fk52z\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.157359 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.157371 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.273155 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-scripts" (OuterVolumeSpecName: "scripts") pod "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" (UID: "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.279005 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" (UID: "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.362269 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.365292 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.395445 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" (UID: "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.457637 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-config-data" (OuterVolumeSpecName: "config-data") pod "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" (UID: "e5fe8f3b-ae29-4a3c-be7a-a645f94d226c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.468088 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.468189 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.700609 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.700787 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5fe8f3b-ae29-4a3c-be7a-a645f94d226c","Type":"ContainerDied","Data":"ee2c96cf4752f271ab59c1e5d9ef8010edcb2061ecccd36a34d602bf9c8f1068"} Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.701302 4886 scope.go:117] "RemoveContainer" containerID="b8916a65aaeb4f4e843c5fba061a08311e52e99052d791e323ff6941a73b7589" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.703902 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3441bcd4-bf8b-406f-b3f5-1c723908bdc4","Type":"ContainerStarted","Data":"8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd"} Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.710978 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11","Type":"ContainerStarted","Data":"c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130"} Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.711129 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130" gracePeriod=30 Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.716602 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c24e1f4d-2c34-4496-bd90-4fe840552491","Type":"ContainerStarted","Data":"9ac610ed30cb05a5e2e84f376b3dae669cc45f85e6a0aacf8442be252f9695ce"} Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.716655 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c24e1f4d-2c34-4496-bd90-4fe840552491","Type":"ContainerStarted","Data":"b24f4f5a92565d88d3fd3da1badf8b5f1cb84c27bbc9afb1415ec3f58dd94565"} Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.725761 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"63670887-1250-42df-a728-315414be9901","Type":"ContainerStarted","Data":"2706075df7ed398bfa86a5019c0c0b891534965545aed4044f6858df83babfa9"} Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.725820 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63670887-1250-42df-a728-315414be9901","Type":"ContainerStarted","Data":"3a64bd79066ba13789ce6be118a26c29652e1e5c788ad39a1b41f13dad0dd1c1"} Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.725987 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="63670887-1250-42df-a728-315414be9901" containerName="nova-metadata-log" containerID="cri-o://3a64bd79066ba13789ce6be118a26c29652e1e5c788ad39a1b41f13dad0dd1c1" gracePeriod=30 Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.726252 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="63670887-1250-42df-a728-315414be9901" containerName="nova-metadata-metadata" containerID="cri-o://2706075df7ed398bfa86a5019c0c0b891534965545aed4044f6858df83babfa9" gracePeriod=30 Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.731446 4886 scope.go:117] "RemoveContainer" containerID="1d04206c0d41b909492932943b574fcef26ed1b2dfcf90d669a67515dcaabab7" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.747727 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.384371438 podStartE2EDuration="8.747689038s" podCreationTimestamp="2026-01-29 17:09:12 +0000 UTC" firstStartedPulling="2026-01-29 17:09:14.125761628 +0000 UTC m=+2837.034480900" lastFinishedPulling="2026-01-29 17:09:19.489079228 +0000 UTC m=+2842.397798500" observedRunningTime="2026-01-29 17:09:20.72644403 +0000 UTC m=+2843.635163332" watchObservedRunningTime="2026-01-29 17:09:20.747689038 +0000 UTC m=+2843.656408310" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.763036 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-6zh6p"] Jan 29 17:09:20 crc kubenswrapper[4886]: E0129 17:09:20.763479 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="ceilometer-notification-agent" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.763498 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="ceilometer-notification-agent" Jan 29 17:09:20 crc kubenswrapper[4886]: E0129 17:09:20.763525 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="sg-core" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.763533 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="sg-core" Jan 29 17:09:20 crc kubenswrapper[4886]: E0129 17:09:20.763559 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="ceilometer-central-agent" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.763567 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="ceilometer-central-agent" Jan 29 17:09:20 crc kubenswrapper[4886]: E0129 17:09:20.763595 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="proxy-httpd" Jan 29 17:09:20 crc 
kubenswrapper[4886]: I0129 17:09:20.763601 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="proxy-httpd" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.763908 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="ceilometer-notification-agent" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.763928 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="sg-core" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.763940 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="ceilometer-central-agent" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.763958 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" containerName="proxy-httpd" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.765020 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-6zh6p" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.807209 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-6zh6p"] Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.822773 4886 scope.go:117] "RemoveContainer" containerID="8ed383dcd150e84a715deaf0b080e1c2f8bb3800fd02ff47edc2c3516be536cf" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.829870 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.309337944 podStartE2EDuration="8.829849272s" podCreationTimestamp="2026-01-29 17:09:12 +0000 UTC" firstStartedPulling="2026-01-29 17:09:13.967029437 +0000 UTC m=+2836.875748709" lastFinishedPulling="2026-01-29 17:09:19.487540765 +0000 UTC m=+2842.396260037" observedRunningTime="2026-01-29 17:09:20.751748883 +0000 UTC m=+2843.660468155" watchObservedRunningTime="2026-01-29 17:09:20.829849272 +0000 UTC m=+2843.738568544" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.849812 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.329726759 podStartE2EDuration="8.849791324s" podCreationTimestamp="2026-01-29 17:09:12 +0000 UTC" firstStartedPulling="2026-01-29 17:09:13.966091071 +0000 UTC m=+2836.874810343" lastFinishedPulling="2026-01-29 17:09:19.486155636 +0000 UTC m=+2842.394874908" observedRunningTime="2026-01-29 17:09:20.786768399 +0000 UTC m=+2843.695487681" watchObservedRunningTime="2026-01-29 17:09:20.849791324 +0000 UTC m=+2843.758510596" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.880027 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323a490d-33e2-4411-8a77-c578f409ba28-operator-scripts\") pod \"aodh-db-create-6zh6p\" (UID: \"323a490d-33e2-4411-8a77-c578f409ba28\") " pod="openstack/aodh-db-create-6zh6p" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.880091 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mldf\" (UniqueName: \"kubernetes.io/projected/323a490d-33e2-4411-8a77-c578f409ba28-kube-api-access-5mldf\") pod \"aodh-db-create-6zh6p\" (UID: \"323a490d-33e2-4411-8a77-c578f409ba28\") " pod="openstack/aodh-db-create-6zh6p" Jan 29 17:09:20 crc 
kubenswrapper[4886]: I0129 17:09:20.928700 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.933671 4886 scope.go:117] "RemoveContainer" containerID="f28b9a9b2e33861b2b8937e8a0acf07992031f2291a0da6c8fc53223704d8f50" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.960088 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.973554 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-60d5-account-create-update-w67hv"] Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.976943 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-60d5-account-create-update-w67hv" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.983390 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323a490d-33e2-4411-8a77-c578f409ba28-operator-scripts\") pod \"aodh-db-create-6zh6p\" (UID: \"323a490d-33e2-4411-8a77-c578f409ba28\") " pod="openstack/aodh-db-create-6zh6p" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.983466 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mldf\" (UniqueName: \"kubernetes.io/projected/323a490d-33e2-4411-8a77-c578f409ba28-kube-api-access-5mldf\") pod \"aodh-db-create-6zh6p\" (UID: \"323a490d-33e2-4411-8a77-c578f409ba28\") " pod="openstack/aodh-db-create-6zh6p" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.984720 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323a490d-33e2-4411-8a77-c578f409ba28-operator-scripts\") pod \"aodh-db-create-6zh6p\" (UID: \"323a490d-33e2-4411-8a77-c578f409ba28\") " pod="openstack/aodh-db-create-6zh6p" Jan 29 17:09:20 crc kubenswrapper[4886]: I0129 17:09:20.994833 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.000781 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-60d5-account-create-update-w67hv"] Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.005870 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.854569901 podStartE2EDuration="9.00585373s" podCreationTimestamp="2026-01-29 17:09:12 +0000 UTC" firstStartedPulling="2026-01-29 17:09:14.33957675 +0000 UTC m=+2837.248296022" lastFinishedPulling="2026-01-29 17:09:19.490860579 +0000 UTC m=+2842.399579851" observedRunningTime="2026-01-29 17:09:20.853581191 +0000 UTC m=+2843.762300473" watchObservedRunningTime="2026-01-29 17:09:21.00585373 +0000 UTC m=+2843.914573002" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.014067 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mldf\" (UniqueName: \"kubernetes.io/projected/323a490d-33e2-4411-8a77-c578f409ba28-kube-api-access-5mldf\") pod \"aodh-db-create-6zh6p\" (UID: \"323a490d-33e2-4411-8a77-c578f409ba28\") " pod="openstack/aodh-db-create-6zh6p" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.045384 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.048299 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.057644 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.057871 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.058634 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.086355 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.086432 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec6f2462-b78d-4619-9704-5cc67ae60974-operator-scripts\") pod \"aodh-60d5-account-create-update-w67hv\" (UID: \"ec6f2462-b78d-4619-9704-5cc67ae60974\") " pod="openstack/aodh-60d5-account-create-update-w67hv" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.086483 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.086507 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-log-httpd\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.086605 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-run-httpd\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.086621 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbzsd\" (UniqueName: \"kubernetes.io/projected/ec6f2462-b78d-4619-9704-5cc67ae60974-kube-api-access-sbzsd\") pod \"aodh-60d5-account-create-update-w67hv\" (UID: \"ec6f2462-b78d-4619-9704-5cc67ae60974\") " pod="openstack/aodh-60d5-account-create-update-w67hv" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.086657 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-scripts\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.086719 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjk8j\" (UniqueName: \"kubernetes.io/projected/295921c4-07ca-4972-a4fa-0a64f46855ec-kube-api-access-wjk8j\") pod 
\"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.086737 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-config-data\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.115015 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-6zh6p" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.205115 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjk8j\" (UniqueName: \"kubernetes.io/projected/295921c4-07ca-4972-a4fa-0a64f46855ec-kube-api-access-wjk8j\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.205173 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-config-data\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.205245 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.205311 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec6f2462-b78d-4619-9704-5cc67ae60974-operator-scripts\") pod \"aodh-60d5-account-create-update-w67hv\" (UID: \"ec6f2462-b78d-4619-9704-5cc67ae60974\") " pod="openstack/aodh-60d5-account-create-update-w67hv" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.205386 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-log-httpd\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.205411 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.205558 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-run-httpd\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.205585 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbzsd\" (UniqueName: \"kubernetes.io/projected/ec6f2462-b78d-4619-9704-5cc67ae60974-kube-api-access-sbzsd\") pod \"aodh-60d5-account-create-update-w67hv\" (UID: \"ec6f2462-b78d-4619-9704-5cc67ae60974\") " 
pod="openstack/aodh-60d5-account-create-update-w67hv" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.205634 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-scripts\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.206510 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec6f2462-b78d-4619-9704-5cc67ae60974-operator-scripts\") pod \"aodh-60d5-account-create-update-w67hv\" (UID: \"ec6f2462-b78d-4619-9704-5cc67ae60974\") " pod="openstack/aodh-60d5-account-create-update-w67hv" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.210846 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-run-httpd\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.211059 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-log-httpd\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.221337 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-config-data\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.223065 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.225410 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.226165 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-scripts\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.226934 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjk8j\" (UniqueName: \"kubernetes.io/projected/295921c4-07ca-4972-a4fa-0a64f46855ec-kube-api-access-wjk8j\") pod \"ceilometer-0\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.262813 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbzsd\" (UniqueName: \"kubernetes.io/projected/ec6f2462-b78d-4619-9704-5cc67ae60974-kube-api-access-sbzsd\") pod \"aodh-60d5-account-create-update-w67hv\" (UID: 
\"ec6f2462-b78d-4619-9704-5cc67ae60974\") " pod="openstack/aodh-60d5-account-create-update-w67hv" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.297903 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-60d5-account-create-update-w67hv" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.406679 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.648539 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-6zh6p"] Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.756364 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-6zh6p" event={"ID":"323a490d-33e2-4411-8a77-c578f409ba28","Type":"ContainerStarted","Data":"e0ac0de75a6d66b5b0eab6f8b648695440128eefa4a612dc2e8eeb54837d3d6c"} Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.761945 4886 generic.go:334] "Generic (PLEG): container finished" podID="63670887-1250-42df-a728-315414be9901" containerID="3a64bd79066ba13789ce6be118a26c29652e1e5c788ad39a1b41f13dad0dd1c1" exitCode=143 Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.761995 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63670887-1250-42df-a728-315414be9901","Type":"ContainerDied","Data":"3a64bd79066ba13789ce6be118a26c29652e1e5c788ad39a1b41f13dad0dd1c1"} Jan 29 17:09:21 crc kubenswrapper[4886]: I0129 17:09:21.845955 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-60d5-account-create-update-w67hv"] Jan 29 17:09:22 crc kubenswrapper[4886]: W0129 17:09:22.176503 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod295921c4_07ca_4972_a4fa_0a64f46855ec.slice/crio-a53c80ed86f57307186bc127fbed1c995aed2de96e312e93825a7c90882f5022 WatchSource:0}: Error finding container a53c80ed86f57307186bc127fbed1c995aed2de96e312e93825a7c90882f5022: Status 404 returned error can't find the container with id a53c80ed86f57307186bc127fbed1c995aed2de96e312e93825a7c90882f5022 Jan 29 17:09:22 crc kubenswrapper[4886]: I0129 17:09:22.177656 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:22 crc kubenswrapper[4886]: I0129 17:09:22.627756 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5fe8f3b-ae29-4a3c-be7a-a645f94d226c" path="/var/lib/kubelet/pods/e5fe8f3b-ae29-4a3c-be7a-a645f94d226c/volumes" Jan 29 17:09:22 crc kubenswrapper[4886]: I0129 17:09:22.780010 4886 generic.go:334] "Generic (PLEG): container finished" podID="ec6f2462-b78d-4619-9704-5cc67ae60974" containerID="94c431dc7f3dd6c3f091efc6b5f4191b950083388e1ef0390fd70fcd7a85128c" exitCode=0 Jan 29 17:09:22 crc kubenswrapper[4886]: I0129 17:09:22.780087 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-60d5-account-create-update-w67hv" event={"ID":"ec6f2462-b78d-4619-9704-5cc67ae60974","Type":"ContainerDied","Data":"94c431dc7f3dd6c3f091efc6b5f4191b950083388e1ef0390fd70fcd7a85128c"} Jan 29 17:09:22 crc kubenswrapper[4886]: I0129 17:09:22.780119 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-60d5-account-create-update-w67hv" event={"ID":"ec6f2462-b78d-4619-9704-5cc67ae60974","Type":"ContainerStarted","Data":"5d3677942b9ad8cac08ad6a8040413f4a4dafcf1a3ca405fc940d518718d37c9"} Jan 29 17:09:22 crc kubenswrapper[4886]: I0129 17:09:22.784374 
4886 generic.go:334] "Generic (PLEG): container finished" podID="323a490d-33e2-4411-8a77-c578f409ba28" containerID="2e1c0eadae73024c2cb0f70a58a6f4f7d1a81518c1e179c7358b1ee70d254152" exitCode=0 Jan 29 17:09:22 crc kubenswrapper[4886]: I0129 17:09:22.784520 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-6zh6p" event={"ID":"323a490d-33e2-4411-8a77-c578f409ba28","Type":"ContainerDied","Data":"2e1c0eadae73024c2cb0f70a58a6f4f7d1a81518c1e179c7358b1ee70d254152"} Jan 29 17:09:22 crc kubenswrapper[4886]: I0129 17:09:22.786057 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"295921c4-07ca-4972-a4fa-0a64f46855ec","Type":"ContainerStarted","Data":"a53c80ed86f57307186bc127fbed1c995aed2de96e312e93825a7c90882f5022"} Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.032896 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.054528 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.109139 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.109194 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.126514 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.126549 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.167574 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.436486 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.444113 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.577873 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-btn45"] Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.578482 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" podUID="da76d93d-7c2d-485e-b5e0-229f4254d74b" containerName="dnsmasq-dns" containerID="cri-o://d9ab37d44f372064ee89522913b27477d9c2a6f3f0efeec33809e585d943fe38" gracePeriod=10 Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.820272 4886 generic.go:334] "Generic (PLEG): container finished" podID="da76d93d-7c2d-485e-b5e0-229f4254d74b" containerID="d9ab37d44f372064ee89522913b27477d9c2a6f3f0efeec33809e585d943fe38" exitCode=0 Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.820607 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" event={"ID":"da76d93d-7c2d-485e-b5e0-229f4254d74b","Type":"ContainerDied","Data":"d9ab37d44f372064ee89522913b27477d9c2a6f3f0efeec33809e585d943fe38"} Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.823793 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"295921c4-07ca-4972-a4fa-0a64f46855ec","Type":"ContainerStarted","Data":"0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707"} Jan 29 17:09:23 crc kubenswrapper[4886]: I0129 17:09:23.874209 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.114593 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.114592 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.515236 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.675067 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-sb\") pod \"da76d93d-7c2d-485e-b5e0-229f4254d74b\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.675186 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6k7c\" (UniqueName: \"kubernetes.io/projected/da76d93d-7c2d-485e-b5e0-229f4254d74b-kube-api-access-m6k7c\") pod \"da76d93d-7c2d-485e-b5e0-229f4254d74b\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.675464 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-config\") pod \"da76d93d-7c2d-485e-b5e0-229f4254d74b\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.675540 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-svc\") pod \"da76d93d-7c2d-485e-b5e0-229f4254d74b\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.675573 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-swift-storage-0\") pod \"da76d93d-7c2d-485e-b5e0-229f4254d74b\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.675654 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-nb\") pod \"da76d93d-7c2d-485e-b5e0-229f4254d74b\" (UID: \"da76d93d-7c2d-485e-b5e0-229f4254d74b\") " Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.752691 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/da76d93d-7c2d-485e-b5e0-229f4254d74b-kube-api-access-m6k7c" (OuterVolumeSpecName: "kube-api-access-m6k7c") pod "da76d93d-7c2d-485e-b5e0-229f4254d74b" (UID: "da76d93d-7c2d-485e-b5e0-229f4254d74b"). InnerVolumeSpecName "kube-api-access-m6k7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.822887 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6k7c\" (UniqueName: \"kubernetes.io/projected/da76d93d-7c2d-485e-b5e0-229f4254d74b-kube-api-access-m6k7c\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.891799 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.895074 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-config" (OuterVolumeSpecName: "config") pod "da76d93d-7c2d-485e-b5e0-229f4254d74b" (UID: "da76d93d-7c2d-485e-b5e0-229f4254d74b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.906143 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"295921c4-07ca-4972-a4fa-0a64f46855ec","Type":"ContainerStarted","Data":"35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529"} Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.906854 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" event={"ID":"da76d93d-7c2d-485e-b5e0-229f4254d74b","Type":"ContainerDied","Data":"bfc495e69c05d32911e1c19e2fff095c3d4fca06c566554a8f30f63272e3f284"} Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.906905 4886 scope.go:117] "RemoveContainer" containerID="d9ab37d44f372064ee89522913b27477d9c2a6f3f0efeec33809e585d943fe38" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.915816 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-60d5-account-create-update-w67hv" event={"ID":"ec6f2462-b78d-4619-9704-5cc67ae60974","Type":"ContainerDied","Data":"5d3677942b9ad8cac08ad6a8040413f4a4dafcf1a3ca405fc940d518718d37c9"} Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.915874 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3677942b9ad8cac08ad6a8040413f4a4dafcf1a3ca405fc940d518718d37c9" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.930622 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.939855 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da76d93d-7c2d-485e-b5e0-229f4254d74b" (UID: "da76d93d-7c2d-485e-b5e0-229f4254d74b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.944311 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da76d93d-7c2d-485e-b5e0-229f4254d74b" (UID: "da76d93d-7c2d-485e-b5e0-229f4254d74b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.958981 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-60d5-account-create-update-w67hv" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.965747 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-6zh6p" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.965842 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da76d93d-7c2d-485e-b5e0-229f4254d74b" (UID: "da76d93d-7c2d-485e-b5e0-229f4254d74b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.965956 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da76d93d-7c2d-485e-b5e0-229f4254d74b" (UID: "da76d93d-7c2d-485e-b5e0-229f4254d74b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:09:24 crc kubenswrapper[4886]: I0129 17:09:24.973117 4886 scope.go:117] "RemoveContainer" containerID="aecb755c349be6f445700545d32b2d2a1cceeb8e44ce0b32e7f93655d8a60679" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.034375 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.034451 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.034465 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.034505 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da76d93d-7c2d-485e-b5e0-229f4254d74b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.137025 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mldf\" (UniqueName: \"kubernetes.io/projected/323a490d-33e2-4411-8a77-c578f409ba28-kube-api-access-5mldf\") pod \"323a490d-33e2-4411-8a77-c578f409ba28\" (UID: \"323a490d-33e2-4411-8a77-c578f409ba28\") " Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.137089 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sbzsd\" (UniqueName: \"kubernetes.io/projected/ec6f2462-b78d-4619-9704-5cc67ae60974-kube-api-access-sbzsd\") pod \"ec6f2462-b78d-4619-9704-5cc67ae60974\" (UID: \"ec6f2462-b78d-4619-9704-5cc67ae60974\") " Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.137262 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec6f2462-b78d-4619-9704-5cc67ae60974-operator-scripts\") pod \"ec6f2462-b78d-4619-9704-5cc67ae60974\" (UID: \"ec6f2462-b78d-4619-9704-5cc67ae60974\") " Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.137459 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323a490d-33e2-4411-8a77-c578f409ba28-operator-scripts\") pod \"323a490d-33e2-4411-8a77-c578f409ba28\" (UID: \"323a490d-33e2-4411-8a77-c578f409ba28\") " Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.138363 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec6f2462-b78d-4619-9704-5cc67ae60974-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec6f2462-b78d-4619-9704-5cc67ae60974" (UID: "ec6f2462-b78d-4619-9704-5cc67ae60974"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.138531 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/323a490d-33e2-4411-8a77-c578f409ba28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "323a490d-33e2-4411-8a77-c578f409ba28" (UID: "323a490d-33e2-4411-8a77-c578f409ba28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.141549 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec6f2462-b78d-4619-9704-5cc67ae60974-kube-api-access-sbzsd" (OuterVolumeSpecName: "kube-api-access-sbzsd") pod "ec6f2462-b78d-4619-9704-5cc67ae60974" (UID: "ec6f2462-b78d-4619-9704-5cc67ae60974"). InnerVolumeSpecName "kube-api-access-sbzsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.141728 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/323a490d-33e2-4411-8a77-c578f409ba28-kube-api-access-5mldf" (OuterVolumeSpecName: "kube-api-access-5mldf") pod "323a490d-33e2-4411-8a77-c578f409ba28" (UID: "323a490d-33e2-4411-8a77-c578f409ba28"). InnerVolumeSpecName "kube-api-access-5mldf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.237846 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-btn45"] Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.239180 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec6f2462-b78d-4619-9704-5cc67ae60974-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.239206 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323a490d-33e2-4411-8a77-c578f409ba28-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.239218 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mldf\" (UniqueName: \"kubernetes.io/projected/323a490d-33e2-4411-8a77-c578f409ba28-kube-api-access-5mldf\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.239229 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbzsd\" (UniqueName: \"kubernetes.io/projected/ec6f2462-b78d-4619-9704-5cc67ae60974-kube-api-access-sbzsd\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.250896 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-btn45"] Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.936840 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"295921c4-07ca-4972-a4fa-0a64f46855ec","Type":"ContainerStarted","Data":"3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2"} Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.941408 4886 generic.go:334] "Generic (PLEG): container finished" podID="8cabf586-398a-45a9-80d6-2fd63d9e14e5" containerID="d6960d602147a760f370e0aaeba322f8c53999b050075e5ef6c33ecafc0b7928" exitCode=0 Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.941500 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tqcf4" event={"ID":"8cabf586-398a-45a9-80d6-2fd63d9e14e5","Type":"ContainerDied","Data":"d6960d602147a760f370e0aaeba322f8c53999b050075e5ef6c33ecafc0b7928"} Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.947635 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-60d5-account-create-update-w67hv" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.947663 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-6zh6p" Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.947712 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-6zh6p" event={"ID":"323a490d-33e2-4411-8a77-c578f409ba28","Type":"ContainerDied","Data":"e0ac0de75a6d66b5b0eab6f8b648695440128eefa4a612dc2e8eeb54837d3d6c"} Jan 29 17:09:25 crc kubenswrapper[4886]: I0129 17:09:25.947737 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0ac0de75a6d66b5b0eab6f8b648695440128eefa4a612dc2e8eeb54837d3d6c" Jan 29 17:09:26 crc kubenswrapper[4886]: I0129 17:09:26.629927 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da76d93d-7c2d-485e-b5e0-229f4254d74b" path="/var/lib/kubelet/pods/da76d93d-7c2d-485e-b5e0-229f4254d74b/volumes" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.443690 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.609100 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhvmq\" (UniqueName: \"kubernetes.io/projected/8cabf586-398a-45a9-80d6-2fd63d9e14e5-kube-api-access-vhvmq\") pod \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.609267 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-combined-ca-bundle\") pod \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.609472 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-scripts\") pod \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.609596 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-config-data\") pod \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\" (UID: \"8cabf586-398a-45a9-80d6-2fd63d9e14e5\") " Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.626076 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cabf586-398a-45a9-80d6-2fd63d9e14e5-kube-api-access-vhvmq" (OuterVolumeSpecName: "kube-api-access-vhvmq") pod "8cabf586-398a-45a9-80d6-2fd63d9e14e5" (UID: "8cabf586-398a-45a9-80d6-2fd63d9e14e5"). InnerVolumeSpecName "kube-api-access-vhvmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.632559 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-scripts" (OuterVolumeSpecName: "scripts") pod "8cabf586-398a-45a9-80d6-2fd63d9e14e5" (UID: "8cabf586-398a-45a9-80d6-2fd63d9e14e5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.668532 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-config-data" (OuterVolumeSpecName: "config-data") pod "8cabf586-398a-45a9-80d6-2fd63d9e14e5" (UID: "8cabf586-398a-45a9-80d6-2fd63d9e14e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.676510 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cabf586-398a-45a9-80d6-2fd63d9e14e5" (UID: "8cabf586-398a-45a9-80d6-2fd63d9e14e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.712772 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.712815 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.712837 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cabf586-398a-45a9-80d6-2fd63d9e14e5-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.712850 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhvmq\" (UniqueName: \"kubernetes.io/projected/8cabf586-398a-45a9-80d6-2fd63d9e14e5-kube-api-access-vhvmq\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.972309 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"295921c4-07ca-4972-a4fa-0a64f46855ec","Type":"ContainerStarted","Data":"63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef"} Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.972611 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.975554 4886 generic.go:334] "Generic (PLEG): container finished" podID="a88a08b7-d54a-4414-b7f6-b490949d6b70" containerID="b0c7be4a8a6f220b0bc62ecd7ce7d07cb8b17e5644962c70a9a466af1717c6ce" exitCode=0 Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.975642 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fznz7" event={"ID":"a88a08b7-d54a-4414-b7f6-b490949d6b70","Type":"ContainerDied","Data":"b0c7be4a8a6f220b0bc62ecd7ce7d07cb8b17e5644962c70a9a466af1717c6ce"} Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.978964 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tqcf4" event={"ID":"8cabf586-398a-45a9-80d6-2fd63d9e14e5","Type":"ContainerDied","Data":"c9ea59738c6ba35a7c3d3e2f05ce7750bd7b76ba456616dc38cec147840a905e"} Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.979001 4886 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c9ea59738c6ba35a7c3d3e2f05ce7750bd7b76ba456616dc38cec147840a905e" Jan 29 17:09:27 crc kubenswrapper[4886]: I0129 17:09:27.979060 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tqcf4" Jan 29 17:09:28 crc kubenswrapper[4886]: I0129 17:09:28.004673 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.217312215 podStartE2EDuration="8.004649476s" podCreationTimestamp="2026-01-29 17:09:20 +0000 UTC" firstStartedPulling="2026-01-29 17:09:22.179245188 +0000 UTC m=+2845.087964460" lastFinishedPulling="2026-01-29 17:09:26.966582449 +0000 UTC m=+2849.875301721" observedRunningTime="2026-01-29 17:09:27.992167079 +0000 UTC m=+2850.900886381" watchObservedRunningTime="2026-01-29 17:09:28.004649476 +0000 UTC m=+2850.913368758" Jan 29 17:09:28 crc kubenswrapper[4886]: I0129 17:09:28.179428 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:09:28 crc kubenswrapper[4886]: I0129 17:09:28.179690 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerName="nova-api-log" containerID="cri-o://b24f4f5a92565d88d3fd3da1badf8b5f1cb84c27bbc9afb1415ec3f58dd94565" gracePeriod=30 Jan 29 17:09:28 crc kubenswrapper[4886]: I0129 17:09:28.179826 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerName="nova-api-api" containerID="cri-o://9ac610ed30cb05a5e2e84f376b3dae669cc45f85e6a0aacf8442be252f9695ce" gracePeriod=30 Jan 29 17:09:28 crc kubenswrapper[4886]: I0129 17:09:28.216049 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:09:28 crc kubenswrapper[4886]: I0129 17:09:28.216580 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3441bcd4-bf8b-406f-b3f5-1c723908bdc4" containerName="nova-scheduler-scheduler" containerID="cri-o://8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd" gracePeriod=30 Jan 29 17:09:28 crc kubenswrapper[4886]: I0129 17:09:28.990789 4886 generic.go:334] "Generic (PLEG): container finished" podID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerID="b24f4f5a92565d88d3fd3da1badf8b5f1cb84c27bbc9afb1415ec3f58dd94565" exitCode=143 Jan 29 17:09:28 crc kubenswrapper[4886]: I0129 17:09:28.991009 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c24e1f4d-2c34-4496-bd90-4fe840552491","Type":"ContainerDied","Data":"b24f4f5a92565d88d3fd3da1badf8b5f1cb84c27bbc9afb1415ec3f58dd94565"} Jan 29 17:09:28 crc kubenswrapper[4886]: I0129 17:09:28.997715 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7756b9d78c-btn45" podUID="da76d93d-7c2d-485e-b5e0-229f4254d74b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.227:5353: i/o timeout" Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.468812 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.654525 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n78gf\" (UniqueName: \"kubernetes.io/projected/a88a08b7-d54a-4414-b7f6-b490949d6b70-kube-api-access-n78gf\") pod \"a88a08b7-d54a-4414-b7f6-b490949d6b70\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.654704 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-config-data\") pod \"a88a08b7-d54a-4414-b7f6-b490949d6b70\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.655554 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-combined-ca-bundle\") pod \"a88a08b7-d54a-4414-b7f6-b490949d6b70\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.655584 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-scripts\") pod \"a88a08b7-d54a-4414-b7f6-b490949d6b70\" (UID: \"a88a08b7-d54a-4414-b7f6-b490949d6b70\") " Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.661174 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a88a08b7-d54a-4414-b7f6-b490949d6b70-kube-api-access-n78gf" (OuterVolumeSpecName: "kube-api-access-n78gf") pod "a88a08b7-d54a-4414-b7f6-b490949d6b70" (UID: "a88a08b7-d54a-4414-b7f6-b490949d6b70"). InnerVolumeSpecName "kube-api-access-n78gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.662514 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-scripts" (OuterVolumeSpecName: "scripts") pod "a88a08b7-d54a-4414-b7f6-b490949d6b70" (UID: "a88a08b7-d54a-4414-b7f6-b490949d6b70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.689071 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-config-data" (OuterVolumeSpecName: "config-data") pod "a88a08b7-d54a-4414-b7f6-b490949d6b70" (UID: "a88a08b7-d54a-4414-b7f6-b490949d6b70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.717537 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a88a08b7-d54a-4414-b7f6-b490949d6b70" (UID: "a88a08b7-d54a-4414-b7f6-b490949d6b70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.758933 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n78gf\" (UniqueName: \"kubernetes.io/projected/a88a08b7-d54a-4414-b7f6-b490949d6b70-kube-api-access-n78gf\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.758969 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.758979 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:29 crc kubenswrapper[4886]: I0129 17:09:29.758990 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a88a08b7-d54a-4414-b7f6-b490949d6b70-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.002803 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fznz7" event={"ID":"a88a08b7-d54a-4414-b7f6-b490949d6b70","Type":"ContainerDied","Data":"0f300c9b5b26753aaff19219c045a650f2a2a1dbd8aa16dd9736b14b2cbcde2c"} Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.003145 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f300c9b5b26753aaff19219c045a650f2a2a1dbd8aa16dd9736b14b2cbcde2c" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.002856 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fznz7" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.122089 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 17:09:30 crc kubenswrapper[4886]: E0129 17:09:30.122700 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6f2462-b78d-4619-9704-5cc67ae60974" containerName="mariadb-account-create-update" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.122720 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6f2462-b78d-4619-9704-5cc67ae60974" containerName="mariadb-account-create-update" Jan 29 17:09:30 crc kubenswrapper[4886]: E0129 17:09:30.122735 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="323a490d-33e2-4411-8a77-c578f409ba28" containerName="mariadb-database-create" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.122744 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="323a490d-33e2-4411-8a77-c578f409ba28" containerName="mariadb-database-create" Jan 29 17:09:30 crc kubenswrapper[4886]: E0129 17:09:30.122761 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a88a08b7-d54a-4414-b7f6-b490949d6b70" containerName="nova-cell1-conductor-db-sync" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.122770 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a88a08b7-d54a-4414-b7f6-b490949d6b70" containerName="nova-cell1-conductor-db-sync" Jan 29 17:09:30 crc kubenswrapper[4886]: E0129 17:09:30.122791 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da76d93d-7c2d-485e-b5e0-229f4254d74b" containerName="dnsmasq-dns" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.122799 4886 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="da76d93d-7c2d-485e-b5e0-229f4254d74b" containerName="dnsmasq-dns" Jan 29 17:09:30 crc kubenswrapper[4886]: E0129 17:09:30.122814 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da76d93d-7c2d-485e-b5e0-229f4254d74b" containerName="init" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.122822 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="da76d93d-7c2d-485e-b5e0-229f4254d74b" containerName="init" Jan 29 17:09:30 crc kubenswrapper[4886]: E0129 17:09:30.122883 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cabf586-398a-45a9-80d6-2fd63d9e14e5" containerName="nova-manage" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.122893 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cabf586-398a-45a9-80d6-2fd63d9e14e5" containerName="nova-manage" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.123178 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cabf586-398a-45a9-80d6-2fd63d9e14e5" containerName="nova-manage" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.123203 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="da76d93d-7c2d-485e-b5e0-229f4254d74b" containerName="dnsmasq-dns" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.123216 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="323a490d-33e2-4411-8a77-c578f409ba28" containerName="mariadb-database-create" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.123236 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec6f2462-b78d-4619-9704-5cc67ae60974" containerName="mariadb-account-create-update" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.123255 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a88a08b7-d54a-4414-b7f6-b490949d6b70" containerName="nova-cell1-conductor-db-sync" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.124291 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.133600 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.135663 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.275685 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5cnc\" (UniqueName: \"kubernetes.io/projected/08160d2e-8072-4d08-9dd2-4b5f256b6d9d-kube-api-access-n5cnc\") pod \"nova-cell1-conductor-0\" (UID: \"08160d2e-8072-4d08-9dd2-4b5f256b6d9d\") " pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.275751 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08160d2e-8072-4d08-9dd2-4b5f256b6d9d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"08160d2e-8072-4d08-9dd2-4b5f256b6d9d\") " pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.275773 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08160d2e-8072-4d08-9dd2-4b5f256b6d9d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"08160d2e-8072-4d08-9dd2-4b5f256b6d9d\") " pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.378030 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5cnc\" (UniqueName: \"kubernetes.io/projected/08160d2e-8072-4d08-9dd2-4b5f256b6d9d-kube-api-access-n5cnc\") pod \"nova-cell1-conductor-0\" (UID: \"08160d2e-8072-4d08-9dd2-4b5f256b6d9d\") " pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.378099 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08160d2e-8072-4d08-9dd2-4b5f256b6d9d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"08160d2e-8072-4d08-9dd2-4b5f256b6d9d\") " pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.378121 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08160d2e-8072-4d08-9dd2-4b5f256b6d9d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"08160d2e-8072-4d08-9dd2-4b5f256b6d9d\") " pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.396084 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08160d2e-8072-4d08-9dd2-4b5f256b6d9d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"08160d2e-8072-4d08-9dd2-4b5f256b6d9d\") " pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.403154 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5cnc\" (UniqueName: \"kubernetes.io/projected/08160d2e-8072-4d08-9dd2-4b5f256b6d9d-kube-api-access-n5cnc\") pod \"nova-cell1-conductor-0\" (UID: \"08160d2e-8072-4d08-9dd2-4b5f256b6d9d\") " pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 
17:09:30.417385 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08160d2e-8072-4d08-9dd2-4b5f256b6d9d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"08160d2e-8072-4d08-9dd2-4b5f256b6d9d\") " pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.462306 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.584133 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.685379 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-config-data\") pod \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.685912 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-combined-ca-bundle\") pod \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.686351 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dblx2\" (UniqueName: \"kubernetes.io/projected/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-kube-api-access-dblx2\") pod \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\" (UID: \"3441bcd4-bf8b-406f-b3f5-1c723908bdc4\") " Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.690041 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-kube-api-access-dblx2" (OuterVolumeSpecName: "kube-api-access-dblx2") pod "3441bcd4-bf8b-406f-b3f5-1c723908bdc4" (UID: "3441bcd4-bf8b-406f-b3f5-1c723908bdc4"). InnerVolumeSpecName "kube-api-access-dblx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.743099 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3441bcd4-bf8b-406f-b3f5-1c723908bdc4" (UID: "3441bcd4-bf8b-406f-b3f5-1c723908bdc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.781235 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-config-data" (OuterVolumeSpecName: "config-data") pod "3441bcd4-bf8b-406f-b3f5-1c723908bdc4" (UID: "3441bcd4-bf8b-406f-b3f5-1c723908bdc4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.802625 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.802971 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dblx2\" (UniqueName: \"kubernetes.io/projected/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-kube-api-access-dblx2\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:30 crc kubenswrapper[4886]: I0129 17:09:30.802988 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3441bcd4-bf8b-406f-b3f5-1c723908bdc4-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.017747 4886 generic.go:334] "Generic (PLEG): container finished" podID="3441bcd4-bf8b-406f-b3f5-1c723908bdc4" containerID="8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd" exitCode=0 Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.017799 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3441bcd4-bf8b-406f-b3f5-1c723908bdc4","Type":"ContainerDied","Data":"8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd"} Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.017835 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3441bcd4-bf8b-406f-b3f5-1c723908bdc4","Type":"ContainerDied","Data":"95891069401cb7e43c836c472c728a63f5e1133c6a2287df2be68780c76d5016"} Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.017852 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.017863 4886 scope.go:117] "RemoveContainer" containerID="8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.053154 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.057101 4886 scope.go:117] "RemoveContainer" containerID="8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd" Jan 29 17:09:31 crc kubenswrapper[4886]: E0129 17:09:31.058036 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd\": container with ID starting with 8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd not found: ID does not exist" containerID="8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.058070 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd"} err="failed to get container status \"8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd\": rpc error: code = NotFound desc = could not find container \"8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd\": container with ID starting with 8808eab58f9c8adf5605704cca70ec0bf454f6f62d9777e76ad457d3030718bd not found: ID does not exist" Jan 29 17:09:31 crc kubenswrapper[4886]: W0129 17:09:31.066699 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08160d2e_8072_4d08_9dd2_4b5f256b6d9d.slice/crio-4d3c24bf2c92e30e5d04905392db5a16900f98de2ca897fc14f080b8ecc389fe WatchSource:0}: Error finding container 4d3c24bf2c92e30e5d04905392db5a16900f98de2ca897fc14f080b8ecc389fe: Status 404 returned error can't find the container with id 4d3c24bf2c92e30e5d04905392db5a16900f98de2ca897fc14f080b8ecc389fe Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.067880 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.089073 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.100969 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:09:31 crc kubenswrapper[4886]: E0129 17:09:31.101716 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3441bcd4-bf8b-406f-b3f5-1c723908bdc4" containerName="nova-scheduler-scheduler" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.101744 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3441bcd4-bf8b-406f-b3f5-1c723908bdc4" containerName="nova-scheduler-scheduler" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.102034 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3441bcd4-bf8b-406f-b3f5-1c723908bdc4" containerName="nova-scheduler-scheduler" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.103216 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.107632 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.112435 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.212919 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-config-data\") pod \"nova-scheduler-0\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.213057 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.213092 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wgk9\" (UniqueName: \"kubernetes.io/projected/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-kube-api-access-4wgk9\") pod \"nova-scheduler-0\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.316697 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-config-data\") pod \"nova-scheduler-0\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.316800 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.316823 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wgk9\" (UniqueName: \"kubernetes.io/projected/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-kube-api-access-4wgk9\") pod \"nova-scheduler-0\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.335082 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-config-data\") pod \"nova-scheduler-0\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.338165 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wgk9\" (UniqueName: \"kubernetes.io/projected/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-kube-api-access-4wgk9\") pod \"nova-scheduler-0\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.338513 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.428518 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 17:09:31 crc kubenswrapper[4886]: I0129 17:09:31.958140 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.032498 4886 generic.go:334] "Generic (PLEG): container finished" podID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerID="9ac610ed30cb05a5e2e84f376b3dae669cc45f85e6a0aacf8442be252f9695ce" exitCode=0 Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.032583 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c24e1f4d-2c34-4496-bd90-4fe840552491","Type":"ContainerDied","Data":"9ac610ed30cb05a5e2e84f376b3dae669cc45f85e6a0aacf8442be252f9695ce"} Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.037038 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"08160d2e-8072-4d08-9dd2-4b5f256b6d9d","Type":"ContainerStarted","Data":"1c0ce7463f78b041e9b29ab18be8908204e14cb1e5eea448e46f3eae3f631984"} Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.037070 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"08160d2e-8072-4d08-9dd2-4b5f256b6d9d","Type":"ContainerStarted","Data":"4d3c24bf2c92e30e5d04905392db5a16900f98de2ca897fc14f080b8ecc389fe"} Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.037336 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.038353 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd8b58c7-942f-4f89-88a0-ce374fd98f0b","Type":"ContainerStarted","Data":"c2ea7d41eadeb9e0900ac95c53b4acc74be8017115cf4e43325000be7c90063b"} Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.063506 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.063488024 podStartE2EDuration="2.063488024s" podCreationTimestamp="2026-01-29 17:09:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:09:32.049231076 +0000 UTC m=+2854.957950348" watchObservedRunningTime="2026-01-29 17:09:32.063488024 +0000 UTC m=+2854.972207296" Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.113597 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.235452 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-combined-ca-bundle\") pod \"c24e1f4d-2c34-4496-bd90-4fe840552491\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.235527 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sdhg\" (UniqueName: \"kubernetes.io/projected/c24e1f4d-2c34-4496-bd90-4fe840552491-kube-api-access-5sdhg\") pod \"c24e1f4d-2c34-4496-bd90-4fe840552491\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.235610 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-config-data\") pod \"c24e1f4d-2c34-4496-bd90-4fe840552491\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.235816 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c24e1f4d-2c34-4496-bd90-4fe840552491-logs\") pod \"c24e1f4d-2c34-4496-bd90-4fe840552491\" (UID: \"c24e1f4d-2c34-4496-bd90-4fe840552491\") " Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.237237 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24e1f4d-2c34-4496-bd90-4fe840552491-logs" (OuterVolumeSpecName: "logs") pod "c24e1f4d-2c34-4496-bd90-4fe840552491" (UID: "c24e1f4d-2c34-4496-bd90-4fe840552491"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.239172 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24e1f4d-2c34-4496-bd90-4fe840552491-kube-api-access-5sdhg" (OuterVolumeSpecName: "kube-api-access-5sdhg") pod "c24e1f4d-2c34-4496-bd90-4fe840552491" (UID: "c24e1f4d-2c34-4496-bd90-4fe840552491"). InnerVolumeSpecName "kube-api-access-5sdhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.274793 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c24e1f4d-2c34-4496-bd90-4fe840552491" (UID: "c24e1f4d-2c34-4496-bd90-4fe840552491"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.287562 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-config-data" (OuterVolumeSpecName: "config-data") pod "c24e1f4d-2c34-4496-bd90-4fe840552491" (UID: "c24e1f4d-2c34-4496-bd90-4fe840552491"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.339186 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.339547 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sdhg\" (UniqueName: \"kubernetes.io/projected/c24e1f4d-2c34-4496-bd90-4fe840552491-kube-api-access-5sdhg\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.339562 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24e1f4d-2c34-4496-bd90-4fe840552491-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.339734 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c24e1f4d-2c34-4496-bd90-4fe840552491-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:32 crc kubenswrapper[4886]: I0129 17:09:32.633039 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3441bcd4-bf8b-406f-b3f5-1c723908bdc4" path="/var/lib/kubelet/pods/3441bcd4-bf8b-406f-b3f5-1c723908bdc4/volumes" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.052405 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd8b58c7-942f-4f89-88a0-ce374fd98f0b","Type":"ContainerStarted","Data":"9734db9b6c351c8b935d8796b19514bcaecf82f2265e11ccf340fb3e8e4c7834"} Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.054540 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c24e1f4d-2c34-4496-bd90-4fe840552491","Type":"ContainerDied","Data":"eb8a3baac4fbd0a80179f8a19f3f61fb9fca2e4d5dcfe096915c43ef69238e98"} Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.054568 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.054582 4886 scope.go:117] "RemoveContainer" containerID="9ac610ed30cb05a5e2e84f376b3dae669cc45f85e6a0aacf8442be252f9695ce" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.086877 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.08685442 podStartE2EDuration="2.08685442s" podCreationTimestamp="2026-01-29 17:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:09:33.07777601 +0000 UTC m=+2855.986495302" watchObservedRunningTime="2026-01-29 17:09:33.08685442 +0000 UTC m=+2855.995573692" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.095952 4886 scope.go:117] "RemoveContainer" containerID="b24f4f5a92565d88d3fd3da1badf8b5f1cb84c27bbc9afb1415ec3f58dd94565" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.109393 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.123407 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.136802 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 17:09:33 crc kubenswrapper[4886]: E0129 17:09:33.137266 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerName="nova-api-log" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.137282 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerName="nova-api-log" Jan 29 17:09:33 crc kubenswrapper[4886]: E0129 17:09:33.137345 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerName="nova-api-api" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.137352 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerName="nova-api-api" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.137539 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerName="nova-api-log" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.137559 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e1f4d-2c34-4496-bd90-4fe840552491" containerName="nova-api-api" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.144390 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.149898 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.178352 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.271123 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gwd6\" (UniqueName: \"kubernetes.io/projected/8c6e91d6-fc51-499e-b78b-00e296eac00d-kube-api-access-5gwd6\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.271598 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c6e91d6-fc51-499e-b78b-00e296eac00d-logs\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.271753 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.272005 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-config-data\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.373844 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.374012 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-config-data\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.374113 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gwd6\" (UniqueName: \"kubernetes.io/projected/8c6e91d6-fc51-499e-b78b-00e296eac00d-kube-api-access-5gwd6\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.374160 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c6e91d6-fc51-499e-b78b-00e296eac00d-logs\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.374656 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c6e91d6-fc51-499e-b78b-00e296eac00d-logs\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " 
pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.381080 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.382066 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-config-data\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.400661 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gwd6\" (UniqueName: \"kubernetes.io/projected/8c6e91d6-fc51-499e-b78b-00e296eac00d-kube-api-access-5gwd6\") pod \"nova-api-0\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.480213 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:09:33 crc kubenswrapper[4886]: I0129 17:09:33.981423 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:09:34 crc kubenswrapper[4886]: I0129 17:09:34.068395 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c6e91d6-fc51-499e-b78b-00e296eac00d","Type":"ContainerStarted","Data":"2e00cbff980509a81df06975ce0505dd9daf5a8bd0d230ec6e3bf51d83a43450"} Jan 29 17:09:34 crc kubenswrapper[4886]: I0129 17:09:34.633140 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24e1f4d-2c34-4496-bd90-4fe840552491" path="/var/lib/kubelet/pods/c24e1f4d-2c34-4496-bd90-4fe840552491/volumes" Jan 29 17:09:35 crc kubenswrapper[4886]: I0129 17:09:35.079858 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c6e91d6-fc51-499e-b78b-00e296eac00d","Type":"ContainerStarted","Data":"f7c0f51e04a1da68994cf51db97c7c851cff30a285cc4a371f750594853805ae"} Jan 29 17:09:35 crc kubenswrapper[4886]: I0129 17:09:35.080209 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c6e91d6-fc51-499e-b78b-00e296eac00d","Type":"ContainerStarted","Data":"b095c2996e7ff38f4d839b7c99b3243d8facce91df007a86d00bced397c851ce"} Jan 29 17:09:35 crc kubenswrapper[4886]: I0129 17:09:35.107232 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.10721183 podStartE2EDuration="2.10721183s" podCreationTimestamp="2026-01-29 17:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:09:35.10197172 +0000 UTC m=+2858.010691002" watchObservedRunningTime="2026-01-29 17:09:35.10721183 +0000 UTC m=+2858.015931102" Jan 29 17:09:36 crc kubenswrapper[4886]: I0129 17:09:36.429640 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 29 17:09:40 crc kubenswrapper[4886]: I0129 17:09:40.491155 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 29 17:09:41 crc kubenswrapper[4886]: I0129 17:09:41.429744 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-scheduler-0" Jan 29 17:09:41 crc kubenswrapper[4886]: I0129 17:09:41.462973 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 29 17:09:42 crc kubenswrapper[4886]: I0129 17:09:42.225566 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 29 17:09:43 crc kubenswrapper[4886]: I0129 17:09:43.481199 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 17:09:43 crc kubenswrapper[4886]: I0129 17:09:43.481540 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 17:09:44 crc kubenswrapper[4886]: I0129 17:09:44.522538 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.8:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 17:09:44 crc kubenswrapper[4886]: I0129 17:09:44.563562 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.8:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.200736 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.320349 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-combined-ca-bundle\") pod \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.320526 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmffz\" (UniqueName: \"kubernetes.io/projected/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-kube-api-access-rmffz\") pod \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.320578 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-config-data\") pod \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\" (UID: \"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11\") " Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.325727 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-kube-api-access-rmffz" (OuterVolumeSpecName: "kube-api-access-rmffz") pod "cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11" (UID: "cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11"). InnerVolumeSpecName "kube-api-access-rmffz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.335116 4886 generic.go:334] "Generic (PLEG): container finished" podID="63670887-1250-42df-a728-315414be9901" containerID="2706075df7ed398bfa86a5019c0c0b891534965545aed4044f6858df83babfa9" exitCode=137 Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.335196 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63670887-1250-42df-a728-315414be9901","Type":"ContainerDied","Data":"2706075df7ed398bfa86a5019c0c0b891534965545aed4044f6858df83babfa9"} Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.335277 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63670887-1250-42df-a728-315414be9901","Type":"ContainerDied","Data":"54233804a9ed5dc337d2e33b8c617c4a33e85a8e6af923aaf251e6cf9186b374"} Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.335291 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54233804a9ed5dc337d2e33b8c617c4a33e85a8e6af923aaf251e6cf9186b374" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.337998 4886 generic.go:334] "Generic (PLEG): container finished" podID="cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11" containerID="c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130" exitCode=137 Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.338030 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11","Type":"ContainerDied","Data":"c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130"} Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.338063 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11","Type":"ContainerDied","Data":"b9417b27c0621c2b043b290e7d29fbfb8ed923b29824c45f4941d5924a3fcf00"} Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.338086 4886 scope.go:117] "RemoveContainer" containerID="c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.338102 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.345345 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.353100 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11" (UID: "cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.357119 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-config-data" (OuterVolumeSpecName: "config-data") pod "cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11" (UID: "cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.366680 4886 scope.go:117] "RemoveContainer" containerID="c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130" Jan 29 17:09:51 crc kubenswrapper[4886]: E0129 17:09:51.375535 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130\": container with ID starting with c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130 not found: ID does not exist" containerID="c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.375587 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130"} err="failed to get container status \"c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130\": rpc error: code = NotFound desc = could not find container \"c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130\": container with ID starting with c1835e2ae50e04a7c3dfeb3c6fd089c66709163b5092c57a8393b86cc24e0130 not found: ID does not exist" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.418427 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.421999 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-config-data\") pod \"63670887-1250-42df-a728-315414be9901\" (UID: \"63670887-1250-42df-a728-315414be9901\") " Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.422119 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63670887-1250-42df-a728-315414be9901-logs\") pod \"63670887-1250-42df-a728-315414be9901\" (UID: \"63670887-1250-42df-a728-315414be9901\") " Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.422220 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frqq5\" (UniqueName: \"kubernetes.io/projected/63670887-1250-42df-a728-315414be9901-kube-api-access-frqq5\") pod \"63670887-1250-42df-a728-315414be9901\" (UID: \"63670887-1250-42df-a728-315414be9901\") " Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.422241 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-combined-ca-bundle\") pod \"63670887-1250-42df-a728-315414be9901\" (UID: \"63670887-1250-42df-a728-315414be9901\") " Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.422506 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63670887-1250-42df-a728-315414be9901-logs" (OuterVolumeSpecName: "logs") pod "63670887-1250-42df-a728-315414be9901" (UID: "63670887-1250-42df-a728-315414be9901"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.423023 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63670887-1250-42df-a728-315414be9901-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.423047 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.423060 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmffz\" (UniqueName: \"kubernetes.io/projected/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-kube-api-access-rmffz\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.423073 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.426706 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63670887-1250-42df-a728-315414be9901-kube-api-access-frqq5" (OuterVolumeSpecName: "kube-api-access-frqq5") pod "63670887-1250-42df-a728-315414be9901" (UID: "63670887-1250-42df-a728-315414be9901"). InnerVolumeSpecName "kube-api-access-frqq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.455018 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63670887-1250-42df-a728-315414be9901" (UID: "63670887-1250-42df-a728-315414be9901"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.476442 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-config-data" (OuterVolumeSpecName: "config-data") pod "63670887-1250-42df-a728-315414be9901" (UID: "63670887-1250-42df-a728-315414be9901"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.525838 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frqq5\" (UniqueName: \"kubernetes.io/projected/63670887-1250-42df-a728-315414be9901-kube-api-access-frqq5\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.525871 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.525881 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63670887-1250-42df-a728-315414be9901-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.679496 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.692931 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.720217 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 17:09:51 crc kubenswrapper[4886]: E0129 17:09:51.720808 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63670887-1250-42df-a728-315414be9901" containerName="nova-metadata-log" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.720832 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="63670887-1250-42df-a728-315414be9901" containerName="nova-metadata-log" Jan 29 17:09:51 crc kubenswrapper[4886]: E0129 17:09:51.720862 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.720869 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 17:09:51 crc kubenswrapper[4886]: E0129 17:09:51.720888 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63670887-1250-42df-a728-315414be9901" containerName="nova-metadata-metadata" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.720895 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="63670887-1250-42df-a728-315414be9901" containerName="nova-metadata-metadata" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.721148 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="63670887-1250-42df-a728-315414be9901" containerName="nova-metadata-log" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.721159 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.721174 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="63670887-1250-42df-a728-315414be9901" containerName="nova-metadata-metadata" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.722001 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.728725 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.728842 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.729003 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.736250 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.831725 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.832081 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.832113 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.832144 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqwsr\" (UniqueName: \"kubernetes.io/projected/c2249ae5-133d-4750-9d7a-529dc8c9b39a-kube-api-access-jqwsr\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.832173 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.934271 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.934784 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 
17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.934967 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.935131 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqwsr\" (UniqueName: \"kubernetes.io/projected/c2249ae5-133d-4750-9d7a-529dc8c9b39a-kube-api-access-jqwsr\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.935314 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.941676 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.942490 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.944480 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.945108 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2249ae5-133d-4750-9d7a-529dc8c9b39a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:51 crc kubenswrapper[4886]: I0129 17:09:51.963902 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqwsr\" (UniqueName: \"kubernetes.io/projected/c2249ae5-133d-4750-9d7a-529dc8c9b39a-kube-api-access-jqwsr\") pod \"nova-cell1-novncproxy-0\" (UID: \"c2249ae5-133d-4750-9d7a-529dc8c9b39a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.047852 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.352197 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.409677 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.432505 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.443999 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.447340 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.450582 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.451423 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.470257 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.567345 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-config-data\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.567427 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-logs\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.567462 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr8p2\" (UniqueName: \"kubernetes.io/projected/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-kube-api-access-dr8p2\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.567487 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.567904 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.570549 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 17:09:52 crc kubenswrapper[4886]: W0129 17:09:52.572822 4886 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2249ae5_133d_4750_9d7a_529dc8c9b39a.slice/crio-6754113d5c90815a693b7e3bf97f1354a3d88b39a82b7947f77ce2319b1548f0 WatchSource:0}: Error finding container 6754113d5c90815a693b7e3bf97f1354a3d88b39a82b7947f77ce2319b1548f0: Status 404 returned error can't find the container with id 6754113d5c90815a693b7e3bf97f1354a3d88b39a82b7947f77ce2319b1548f0 Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.631573 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63670887-1250-42df-a728-315414be9901" path="/var/lib/kubelet/pods/63670887-1250-42df-a728-315414be9901/volumes" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.633167 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11" path="/var/lib/kubelet/pods/cb5b14f4-92b2-4f90-bfb8-1d00ab4c7e11/volumes" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.670891 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-config-data\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.671689 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-logs\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.671738 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr8p2\" (UniqueName: \"kubernetes.io/projected/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-kube-api-access-dr8p2\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.671767 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.671899 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.673020 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-logs\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.678209 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.679539 4886 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-config-data\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.679996 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.690530 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr8p2\" (UniqueName: \"kubernetes.io/projected/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-kube-api-access-dr8p2\") pod \"nova-metadata-0\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " pod="openstack/nova-metadata-0" Jan 29 17:09:52 crc kubenswrapper[4886]: I0129 17:09:52.772617 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 17:09:53 crc kubenswrapper[4886]: W0129 17:09:53.285015 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ba13f7f_cb9d_4147_9f9d_982bd5daac77.slice/crio-590686b9473f5c18e61b69cef7feee9a7b36c136560c55bdbbed141a70bc112d WatchSource:0}: Error finding container 590686b9473f5c18e61b69cef7feee9a7b36c136560c55bdbbed141a70bc112d: Status 404 returned error can't find the container with id 590686b9473f5c18e61b69cef7feee9a7b36c136560c55bdbbed141a70bc112d Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.287139 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.366115 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba13f7f-cb9d-4147-9f9d-982bd5daac77","Type":"ContainerStarted","Data":"590686b9473f5c18e61b69cef7feee9a7b36c136560c55bdbbed141a70bc112d"} Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.367705 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c2249ae5-133d-4750-9d7a-529dc8c9b39a","Type":"ContainerStarted","Data":"fc9c7d986999e5ce62132e547e3e1eb4f54671ebbd953c356d465c7357a314a3"} Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.367754 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c2249ae5-133d-4750-9d7a-529dc8c9b39a","Type":"ContainerStarted","Data":"6754113d5c90815a693b7e3bf97f1354a3d88b39a82b7947f77ce2319b1548f0"} Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.394051 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.394032638 podStartE2EDuration="2.394032638s" podCreationTimestamp="2026-01-29 17:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:09:53.391928848 +0000 UTC m=+2876.300648130" watchObservedRunningTime="2026-01-29 17:09:53.394032638 +0000 UTC m=+2876.302751910" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.488178 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.490841 4886 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.494543 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.494916 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.689395 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b7z4z"] Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.692075 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.697407 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7z4z"] Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.801983 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-utilities\") pod \"community-operators-b7z4z\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.802747 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkxvc\" (UniqueName: \"kubernetes.io/projected/265d5adc-ace5-4008-99d5-206b5182e6d4-kube-api-access-xkxvc\") pod \"community-operators-b7z4z\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.802848 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-catalog-content\") pod \"community-operators-b7z4z\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.904374 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-utilities\") pod \"community-operators-b7z4z\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.904498 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkxvc\" (UniqueName: \"kubernetes.io/projected/265d5adc-ace5-4008-99d5-206b5182e6d4-kube-api-access-xkxvc\") pod \"community-operators-b7z4z\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.904550 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-catalog-content\") pod \"community-operators-b7z4z\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.904841 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-catalog-content\") pod \"community-operators-b7z4z\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.904896 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-utilities\") pod \"community-operators-b7z4z\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:09:53 crc kubenswrapper[4886]: I0129 17:09:53.934701 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkxvc\" (UniqueName: \"kubernetes.io/projected/265d5adc-ace5-4008-99d5-206b5182e6d4-kube-api-access-xkxvc\") pod \"community-operators-b7z4z\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.047005 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.384474 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba13f7f-cb9d-4147-9f9d-982bd5daac77","Type":"ContainerStarted","Data":"cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9"} Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.384909 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba13f7f-cb9d-4147-9f9d-982bd5daac77","Type":"ContainerStarted","Data":"5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc"} Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.386263 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.389868 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.412468 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.412444583 podStartE2EDuration="2.412444583s" podCreationTimestamp="2026-01-29 17:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:09:54.411449404 +0000 UTC m=+2877.320168666" watchObservedRunningTime="2026-01-29 17:09:54.412444583 +0000 UTC m=+2877.321163855" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.650688 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7z4z"] Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.712365 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-fh86h"] Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.715203 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.745345 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-fh86h"] Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.846717 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-config\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.846761 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.846782 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.846935 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.847001 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.847024 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcpq8\" (UniqueName: \"kubernetes.io/projected/efe27968-ef82-463a-8852-222528e7980d-kube-api-access-bcpq8\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.949738 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.950186 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.950210 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcpq8\" (UniqueName: \"kubernetes.io/projected/efe27968-ef82-463a-8852-222528e7980d-kube-api-access-bcpq8\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.950253 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-config\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.950269 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.950290 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.951011 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.951096 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.952569 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.953625 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-config\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.956476 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efe27968-ef82-463a-8852-222528e7980d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:54 crc kubenswrapper[4886]: I0129 17:09:54.987972 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcpq8\" (UniqueName: 
\"kubernetes.io/projected/efe27968-ef82-463a-8852-222528e7980d-kube-api-access-bcpq8\") pod \"dnsmasq-dns-6b7bbf7cf9-fh86h\" (UID: \"efe27968-ef82-463a-8852-222528e7980d\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:55 crc kubenswrapper[4886]: I0129 17:09:55.071739 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:55 crc kubenswrapper[4886]: I0129 17:09:55.399030 4886 generic.go:334] "Generic (PLEG): container finished" podID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerID="c1dd6ae46daebf75b61de05db1d9dcf57ca090cd74e3c93bdef7a80a5b1e0368" exitCode=0 Jan 29 17:09:55 crc kubenswrapper[4886]: I0129 17:09:55.399789 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7z4z" event={"ID":"265d5adc-ace5-4008-99d5-206b5182e6d4","Type":"ContainerDied","Data":"c1dd6ae46daebf75b61de05db1d9dcf57ca090cd74e3c93bdef7a80a5b1e0368"} Jan 29 17:09:55 crc kubenswrapper[4886]: I0129 17:09:55.399845 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7z4z" event={"ID":"265d5adc-ace5-4008-99d5-206b5182e6d4","Type":"ContainerStarted","Data":"b49a773367da81a381e19a2ba4ecf2f2565cbe6beacc718a457751390e647a71"} Jan 29 17:09:55 crc kubenswrapper[4886]: I0129 17:09:55.401837 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 17:09:55 crc kubenswrapper[4886]: I0129 17:09:55.626110 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-fh86h"] Jan 29 17:09:56 crc kubenswrapper[4886]: I0129 17:09:56.410811 4886 generic.go:334] "Generic (PLEG): container finished" podID="efe27968-ef82-463a-8852-222528e7980d" containerID="8e8f92d48ecc2d99355334d6891f6a7a18b5bf8604dbd8b2719327472baa935c" exitCode=0 Jan 29 17:09:56 crc kubenswrapper[4886]: I0129 17:09:56.410895 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" event={"ID":"efe27968-ef82-463a-8852-222528e7980d","Type":"ContainerDied","Data":"8e8f92d48ecc2d99355334d6891f6a7a18b5bf8604dbd8b2719327472baa935c"} Jan 29 17:09:56 crc kubenswrapper[4886]: I0129 17:09:56.411195 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" event={"ID":"efe27968-ef82-463a-8852-222528e7980d","Type":"ContainerStarted","Data":"5c71118f414dec8188ace8063b50692f92c8e5698781b6464b0323ed841eca32"} Jan 29 17:09:56 crc kubenswrapper[4886]: I0129 17:09:56.415250 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7z4z" event={"ID":"265d5adc-ace5-4008-99d5-206b5182e6d4","Type":"ContainerStarted","Data":"3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173"} Jan 29 17:09:56 crc kubenswrapper[4886]: I0129 17:09:56.630543 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:56 crc kubenswrapper[4886]: I0129 17:09:56.631029 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="proxy-httpd" containerID="cri-o://63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef" gracePeriod=30 Jan 29 17:09:56 crc kubenswrapper[4886]: I0129 17:09:56.631108 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" 
containerName="sg-core" containerID="cri-o://3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2" gracePeriod=30 Jan 29 17:09:56 crc kubenswrapper[4886]: I0129 17:09:56.631028 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="ceilometer-central-agent" containerID="cri-o://0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707" gracePeriod=30 Jan 29 17:09:56 crc kubenswrapper[4886]: I0129 17:09:56.631195 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="ceilometer-notification-agent" containerID="cri-o://35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529" gracePeriod=30 Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.048716 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.431061 4886 generic.go:334] "Generic (PLEG): container finished" podID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerID="63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef" exitCode=0 Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.431097 4886 generic.go:334] "Generic (PLEG): container finished" podID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerID="3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2" exitCode=2 Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.431106 4886 generic.go:334] "Generic (PLEG): container finished" podID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerID="0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707" exitCode=0 Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.431135 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"295921c4-07ca-4972-a4fa-0a64f46855ec","Type":"ContainerDied","Data":"63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef"} Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.431189 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"295921c4-07ca-4972-a4fa-0a64f46855ec","Type":"ContainerDied","Data":"3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2"} Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.431202 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"295921c4-07ca-4972-a4fa-0a64f46855ec","Type":"ContainerDied","Data":"0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707"} Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.433432 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" event={"ID":"efe27968-ef82-463a-8852-222528e7980d","Type":"ContainerStarted","Data":"1e10af47f9cb65f41c613b3888f9ea857bb52e7733a459e738c1fe3fa046d41a"} Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.433995 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.456011 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" podStartSLOduration=3.455985915 podStartE2EDuration="3.455985915s" podCreationTimestamp="2026-01-29 17:09:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:09:57.451852686 +0000 UTC m=+2880.360571958" watchObservedRunningTime="2026-01-29 17:09:57.455985915 +0000 UTC m=+2880.364705187" Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.655455 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.656097 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerName="nova-api-log" containerID="cri-o://b095c2996e7ff38f4d839b7c99b3243d8facce91df007a86d00bced397c851ce" gracePeriod=30 Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.656162 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerName="nova-api-api" containerID="cri-o://f7c0f51e04a1da68994cf51db97c7c851cff30a285cc4a371f750594853805ae" gracePeriod=30 Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.773462 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 17:09:57 crc kubenswrapper[4886]: I0129 17:09:57.773929 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.107982 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.250241 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-log-httpd\") pod \"295921c4-07ca-4972-a4fa-0a64f46855ec\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.250898 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-combined-ca-bundle\") pod \"295921c4-07ca-4972-a4fa-0a64f46855ec\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.251154 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-sg-core-conf-yaml\") pod \"295921c4-07ca-4972-a4fa-0a64f46855ec\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.251160 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "295921c4-07ca-4972-a4fa-0a64f46855ec" (UID: "295921c4-07ca-4972-a4fa-0a64f46855ec"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.251246 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjk8j\" (UniqueName: \"kubernetes.io/projected/295921c4-07ca-4972-a4fa-0a64f46855ec-kube-api-access-wjk8j\") pod \"295921c4-07ca-4972-a4fa-0a64f46855ec\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.251397 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-run-httpd\") pod \"295921c4-07ca-4972-a4fa-0a64f46855ec\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.251463 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-config-data\") pod \"295921c4-07ca-4972-a4fa-0a64f46855ec\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.251534 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-scripts\") pod \"295921c4-07ca-4972-a4fa-0a64f46855ec\" (UID: \"295921c4-07ca-4972-a4fa-0a64f46855ec\") " Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.252244 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "295921c4-07ca-4972-a4fa-0a64f46855ec" (UID: "295921c4-07ca-4972-a4fa-0a64f46855ec"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.253194 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.253234 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/295921c4-07ca-4972-a4fa-0a64f46855ec-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.257988 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295921c4-07ca-4972-a4fa-0a64f46855ec-kube-api-access-wjk8j" (OuterVolumeSpecName: "kube-api-access-wjk8j") pod "295921c4-07ca-4972-a4fa-0a64f46855ec" (UID: "295921c4-07ca-4972-a4fa-0a64f46855ec"). InnerVolumeSpecName "kube-api-access-wjk8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.263509 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-scripts" (OuterVolumeSpecName: "scripts") pod "295921c4-07ca-4972-a4fa-0a64f46855ec" (UID: "295921c4-07ca-4972-a4fa-0a64f46855ec"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.308452 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "295921c4-07ca-4972-a4fa-0a64f46855ec" (UID: "295921c4-07ca-4972-a4fa-0a64f46855ec"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.355860 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.355897 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.355915 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjk8j\" (UniqueName: \"kubernetes.io/projected/295921c4-07ca-4972-a4fa-0a64f46855ec-kube-api-access-wjk8j\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.363611 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "295921c4-07ca-4972-a4fa-0a64f46855ec" (UID: "295921c4-07ca-4972-a4fa-0a64f46855ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.393409 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-config-data" (OuterVolumeSpecName: "config-data") pod "295921c4-07ca-4972-a4fa-0a64f46855ec" (UID: "295921c4-07ca-4972-a4fa-0a64f46855ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.448292 4886 generic.go:334] "Generic (PLEG): container finished" podID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerID="35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529" exitCode=0 Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.448382 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"295921c4-07ca-4972-a4fa-0a64f46855ec","Type":"ContainerDied","Data":"35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529"} Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.448400 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.448425 4886 scope.go:117] "RemoveContainer" containerID="63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.448413 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"295921c4-07ca-4972-a4fa-0a64f46855ec","Type":"ContainerDied","Data":"a53c80ed86f57307186bc127fbed1c995aed2de96e312e93825a7c90882f5022"} Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.452581 4886 generic.go:334] "Generic (PLEG): container finished" podID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerID="3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173" exitCode=0 Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.452653 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7z4z" event={"ID":"265d5adc-ace5-4008-99d5-206b5182e6d4","Type":"ContainerDied","Data":"3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173"} Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.458275 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.458304 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295921c4-07ca-4972-a4fa-0a64f46855ec-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.458486 4886 generic.go:334] "Generic (PLEG): container finished" podID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerID="b095c2996e7ff38f4d839b7c99b3243d8facce91df007a86d00bced397c851ce" exitCode=143 Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.458547 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c6e91d6-fc51-499e-b78b-00e296eac00d","Type":"ContainerDied","Data":"b095c2996e7ff38f4d839b7c99b3243d8facce91df007a86d00bced397c851ce"} Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.488117 4886 scope.go:117] "RemoveContainer" containerID="3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.529319 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.529768 4886 scope.go:117] "RemoveContainer" containerID="35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.546244 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.577905 4886 scope.go:117] "RemoveContainer" containerID="0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.579072 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:58 crc kubenswrapper[4886]: E0129 17:09:58.579946 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="ceilometer-notification-agent" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.579965 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" 
containerName="ceilometer-notification-agent" Jan 29 17:09:58 crc kubenswrapper[4886]: E0129 17:09:58.580004 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="proxy-httpd" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.580011 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="proxy-httpd" Jan 29 17:09:58 crc kubenswrapper[4886]: E0129 17:09:58.580038 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="ceilometer-central-agent" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.580044 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="ceilometer-central-agent" Jan 29 17:09:58 crc kubenswrapper[4886]: E0129 17:09:58.580059 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="sg-core" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.580066 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="sg-core" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.586720 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="ceilometer-central-agent" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.586765 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="sg-core" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.586799 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="ceilometer-notification-agent" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.586817 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" containerName="proxy-httpd" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.600657 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.603703 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.607366 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.624805 4886 scope.go:117] "RemoveContainer" containerID="63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef" Jan 29 17:09:58 crc kubenswrapper[4886]: E0129 17:09:58.625935 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef\": container with ID starting with 63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef not found: ID does not exist" containerID="63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.625975 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef"} err="failed to get container status \"63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef\": rpc error: code = NotFound desc = could not find container \"63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef\": container with ID starting with 63a6dbf76c0560d2045aa913e46fcd8eb27522f3a2df8c23f4d345a42f6982ef not found: ID does not exist" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.626002 4886 scope.go:117] "RemoveContainer" containerID="3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2" Jan 29 17:09:58 crc kubenswrapper[4886]: E0129 17:09:58.626460 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2\": container with ID starting with 3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2 not found: ID does not exist" containerID="3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.626537 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2"} err="failed to get container status \"3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2\": rpc error: code = NotFound desc = could not find container \"3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2\": container with ID starting with 3856ce84dbdc829026cdc077123a144ae1db22ed2ef5daec2a2a38e79ea5fff2 not found: ID does not exist" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.626572 4886 scope.go:117] "RemoveContainer" containerID="35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529" Jan 29 17:09:58 crc kubenswrapper[4886]: E0129 17:09:58.626990 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529\": container with ID starting with 35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529 not found: ID does not exist" containerID="35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 
17:09:58.627019 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529"} err="failed to get container status \"35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529\": rpc error: code = NotFound desc = could not find container \"35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529\": container with ID starting with 35e24ed99f8fd2890904f1ca37992a754b300543953f2f3061639a8631f92529 not found: ID does not exist" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.627036 4886 scope.go:117] "RemoveContainer" containerID="0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707" Jan 29 17:09:58 crc kubenswrapper[4886]: E0129 17:09:58.629784 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707\": container with ID starting with 0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707 not found: ID does not exist" containerID="0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.629820 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707"} err="failed to get container status \"0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707\": rpc error: code = NotFound desc = could not find container \"0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707\": container with ID starting with 0b0960c021f6fe492666e7a5f8550203f34c505c88a04448efdf009572fba707 not found: ID does not exist" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.647447 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295921c4-07ca-4972-a4fa-0a64f46855ec" path="/var/lib/kubelet/pods/295921c4-07ca-4972-a4fa-0a64f46855ec/volumes" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.648218 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.771114 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.771468 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-run-httpd\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.771531 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-scripts\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.771618 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-config-data\") 
pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.771671 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-log-httpd\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.771735 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.771767 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcsqj\" (UniqueName: \"kubernetes.io/projected/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-kube-api-access-fcsqj\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.873532 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-config-data\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.873618 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-log-httpd\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.873707 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.873751 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcsqj\" (UniqueName: \"kubernetes.io/projected/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-kube-api-access-fcsqj\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.873769 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.873793 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-run-httpd\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.873835 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-scripts\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.874159 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-log-httpd\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.875038 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-run-httpd\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.879044 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.879122 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-scripts\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.879307 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.879740 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-config-data\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.890658 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcsqj\" (UniqueName: \"kubernetes.io/projected/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-kube-api-access-fcsqj\") pod \"ceilometer-0\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " pod="openstack/ceilometer-0" Jan 29 17:09:58 crc kubenswrapper[4886]: I0129 17:09:58.930979 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:09:59 crc kubenswrapper[4886]: I0129 17:09:59.231883 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:09:59 crc kubenswrapper[4886]: I0129 17:09:59.528943 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:00 crc kubenswrapper[4886]: I0129 17:10:00.487121 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7z4z" event={"ID":"265d5adc-ace5-4008-99d5-206b5182e6d4","Type":"ContainerStarted","Data":"4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede"} Jan 29 17:10:00 crc kubenswrapper[4886]: I0129 17:10:00.490088 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da502cd2-7a05-4d82-a90e-cfbd4069b0ac","Type":"ContainerStarted","Data":"72d7fa6925704b9669a07a61d5a64685973e8bd1e0037e203f9d28200da940d5"} Jan 29 17:10:00 crc kubenswrapper[4886]: I0129 17:10:00.531841 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b7z4z" podStartSLOduration=4.043282847 podStartE2EDuration="7.531816239s" podCreationTimestamp="2026-01-29 17:09:53 +0000 UTC" firstStartedPulling="2026-01-29 17:09:55.401533259 +0000 UTC m=+2878.310252531" lastFinishedPulling="2026-01-29 17:09:58.890066651 +0000 UTC m=+2881.798785923" observedRunningTime="2026-01-29 17:10:00.510964543 +0000 UTC m=+2883.419683915" watchObservedRunningTime="2026-01-29 17:10:00.531816239 +0000 UTC m=+2883.440535551" Jan 29 17:10:01 crc kubenswrapper[4886]: I0129 17:10:01.514494 4886 generic.go:334] "Generic (PLEG): container finished" podID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerID="f7c0f51e04a1da68994cf51db97c7c851cff30a285cc4a371f750594853805ae" exitCode=0 Jan 29 17:10:01 crc kubenswrapper[4886]: I0129 17:10:01.514566 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c6e91d6-fc51-499e-b78b-00e296eac00d","Type":"ContainerDied","Data":"f7c0f51e04a1da68994cf51db97c7c851cff30a285cc4a371f750594853805ae"} Jan 29 17:10:01 crc kubenswrapper[4886]: I0129 17:10:01.927481 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.048759 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.059742 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-combined-ca-bundle\") pod \"8c6e91d6-fc51-499e-b78b-00e296eac00d\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.059795 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c6e91d6-fc51-499e-b78b-00e296eac00d-logs\") pod \"8c6e91d6-fc51-499e-b78b-00e296eac00d\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.060558 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-config-data\") pod \"8c6e91d6-fc51-499e-b78b-00e296eac00d\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.060604 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gwd6\" (UniqueName: \"kubernetes.io/projected/8c6e91d6-fc51-499e-b78b-00e296eac00d-kube-api-access-5gwd6\") pod \"8c6e91d6-fc51-499e-b78b-00e296eac00d\" (UID: \"8c6e91d6-fc51-499e-b78b-00e296eac00d\") " Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.060682 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c6e91d6-fc51-499e-b78b-00e296eac00d-logs" (OuterVolumeSpecName: "logs") pod "8c6e91d6-fc51-499e-b78b-00e296eac00d" (UID: "8c6e91d6-fc51-499e-b78b-00e296eac00d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.061854 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c6e91d6-fc51-499e-b78b-00e296eac00d-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.065722 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6e91d6-fc51-499e-b78b-00e296eac00d-kube-api-access-5gwd6" (OuterVolumeSpecName: "kube-api-access-5gwd6") pod "8c6e91d6-fc51-499e-b78b-00e296eac00d" (UID: "8c6e91d6-fc51-499e-b78b-00e296eac00d"). InnerVolumeSpecName "kube-api-access-5gwd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.104069 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.107607 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-config-data" (OuterVolumeSpecName: "config-data") pod "8c6e91d6-fc51-499e-b78b-00e296eac00d" (UID: "8c6e91d6-fc51-499e-b78b-00e296eac00d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.110124 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c6e91d6-fc51-499e-b78b-00e296eac00d" (UID: "8c6e91d6-fc51-499e-b78b-00e296eac00d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.164155 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.164198 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gwd6\" (UniqueName: \"kubernetes.io/projected/8c6e91d6-fc51-499e-b78b-00e296eac00d-kube-api-access-5gwd6\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.164211 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6e91d6-fc51-499e-b78b-00e296eac00d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.536925 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c6e91d6-fc51-499e-b78b-00e296eac00d","Type":"ContainerDied","Data":"2e00cbff980509a81df06975ce0505dd9daf5a8bd0d230ec6e3bf51d83a43450"} Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.536990 4886 scope.go:117] "RemoveContainer" containerID="f7c0f51e04a1da68994cf51db97c7c851cff30a285cc4a371f750594853805ae" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.537120 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.543740 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da502cd2-7a05-4d82-a90e-cfbd4069b0ac","Type":"ContainerStarted","Data":"91baaab9d9528ba788b818d15e20639e4d6e2fffc89317503dfc698ecdb0a06c"} Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.572527 4886 scope.go:117] "RemoveContainer" containerID="b095c2996e7ff38f4d839b7c99b3243d8facce91df007a86d00bced397c851ce" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.591462 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.653194 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.653242 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.653265 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 17:10:02 crc kubenswrapper[4886]: E0129 17:10:02.657158 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerName="nova-api-log" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.657184 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerName="nova-api-log" Jan 29 17:10:02 crc kubenswrapper[4886]: E0129 17:10:02.657211 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerName="nova-api-api" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.657219 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerName="nova-api-api" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.657440 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerName="nova-api-log" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.657458 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6e91d6-fc51-499e-b78b-00e296eac00d" containerName="nova-api-api" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.671797 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.671927 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.675870 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.676096 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.679922 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.773606 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.773643 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.781613 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.781703 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r59zt\" (UniqueName: \"kubernetes.io/projected/b515f59a-4b3a-4821-bbec-8e622a8164e6-kube-api-access-r59zt\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.781744 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-config-data\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.781780 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-public-tls-certs\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.781810 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b515f59a-4b3a-4821-bbec-8e622a8164e6-logs\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.781905 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.883435 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.883568 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.883592 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r59zt\" (UniqueName: \"kubernetes.io/projected/b515f59a-4b3a-4821-bbec-8e622a8164e6-kube-api-access-r59zt\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.883651 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-config-data\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.883690 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-public-tls-certs\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.883725 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b515f59a-4b3a-4821-bbec-8e622a8164e6-logs\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.886491 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b515f59a-4b3a-4821-bbec-8e622a8164e6-logs\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.891236 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.891709 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.916901 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-public-tls-certs\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.920069 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-config-data\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:02 crc kubenswrapper[4886]: I0129 17:10:02.922594 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-r59zt\" (UniqueName: \"kubernetes.io/projected/b515f59a-4b3a-4821-bbec-8e622a8164e6-kube-api-access-r59zt\") pod \"nova-api-0\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " pod="openstack/nova-api-0" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.034447 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ddfqz"] Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.037054 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.056097 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.056463 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.077467 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.120202 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ddfqz"] Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.232735 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwxx4\" (UniqueName: \"kubernetes.io/projected/7a1c51cd-f91d-406b-815c-00879a9d6401-kube-api-access-xwxx4\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.232846 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.232918 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-scripts\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.233009 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-config-data\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.334872 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-scripts\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.335049 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-config-data\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: 
\"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.335173 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwxx4\" (UniqueName: \"kubernetes.io/projected/7a1c51cd-f91d-406b-815c-00879a9d6401-kube-api-access-xwxx4\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.335221 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.345876 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-scripts\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.346184 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.347026 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-config-data\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.358180 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwxx4\" (UniqueName: \"kubernetes.io/projected/7a1c51cd-f91d-406b-815c-00879a9d6401-kube-api-access-xwxx4\") pod \"nova-cell1-cell-mapping-ddfqz\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.432258 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.594847 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da502cd2-7a05-4d82-a90e-cfbd4069b0ac","Type":"ContainerStarted","Data":"d4c2814ebfa5456f9a32d52477ed9133aa03f6c310e426d0feadc41c2659a8a9"} Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.723635 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.800602 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.10:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 17:10:03 crc kubenswrapper[4886]: I0129 17:10:03.800758 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.10:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 17:10:04 crc kubenswrapper[4886]: I0129 17:10:04.050479 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:10:04 crc kubenswrapper[4886]: I0129 17:10:04.050521 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:10:04 crc kubenswrapper[4886]: I0129 17:10:04.088613 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ddfqz"] Jan 29 17:10:04 crc kubenswrapper[4886]: I0129 17:10:04.663110 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c6e91d6-fc51-499e-b78b-00e296eac00d" path="/var/lib/kubelet/pods/8c6e91d6-fc51-499e-b78b-00e296eac00d/volumes" Jan 29 17:10:04 crc kubenswrapper[4886]: I0129 17:10:04.673952 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da502cd2-7a05-4d82-a90e-cfbd4069b0ac","Type":"ContainerStarted","Data":"189370fd8336eb715dd7e8e4fbb1c1dcacac0f2820ddab52e349e5fc03b6bbea"} Jan 29 17:10:04 crc kubenswrapper[4886]: I0129 17:10:04.683104 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ddfqz" event={"ID":"7a1c51cd-f91d-406b-815c-00879a9d6401","Type":"ContainerStarted","Data":"5be86521758fe7c03f20fd8b758e10774f421701b95693128fa47b2a2e5adc70"} Jan 29 17:10:04 crc kubenswrapper[4886]: I0129 17:10:04.683150 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ddfqz" event={"ID":"7a1c51cd-f91d-406b-815c-00879a9d6401","Type":"ContainerStarted","Data":"f1662f2f91761a984c86477b3a390f7b3bd8f222aea924e68ce2bb82b98bbf96"} Jan 29 17:10:04 crc kubenswrapper[4886]: I0129 17:10:04.705855 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b515f59a-4b3a-4821-bbec-8e622a8164e6","Type":"ContainerStarted","Data":"297512a17905e8884ba2dee2e1bd0e97f5fbde7e67ab2e041189401e3a8b1069"} Jan 29 17:10:04 crc kubenswrapper[4886]: I0129 17:10:04.705900 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b515f59a-4b3a-4821-bbec-8e622a8164e6","Type":"ContainerStarted","Data":"3b5aab9a83beedb9411f1928c81b699649b72f9a5c36a34dc864ad27dbc02c85"} Jan 29 17:10:04 crc kubenswrapper[4886]: I0129 17:10:04.718614 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-ddfqz" podStartSLOduration=2.718596256 podStartE2EDuration="2.718596256s" podCreationTimestamp="2026-01-29 17:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:10:04.714815358 +0000 UTC m=+2887.623534630" watchObservedRunningTime="2026-01-29 17:10:04.718596256 +0000 UTC m=+2887.627315528" Jan 29 17:10:05 crc kubenswrapper[4886]: I0129 17:10:05.073503 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fh86h" Jan 29 17:10:05 crc kubenswrapper[4886]: I0129 17:10:05.125257 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b7z4z" podUID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerName="registry-server" probeResult="failure" output=< Jan 29 17:10:05 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 17:10:05 crc kubenswrapper[4886]: > Jan 29 17:10:05 crc kubenswrapper[4886]: I0129 17:10:05.176407 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zdbgk"] Jan 29 17:10:05 crc kubenswrapper[4886]: I0129 17:10:05.176758 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" podUID="8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" containerName="dnsmasq-dns" containerID="cri-o://18dccc69ea12ffd53b4d4c8e312d9e5ee415348aafbce21b941019b15077a6b6" gracePeriod=10 Jan 29 17:10:05 crc kubenswrapper[4886]: I0129 17:10:05.735068 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b515f59a-4b3a-4821-bbec-8e622a8164e6","Type":"ContainerStarted","Data":"5279babaff011b0a7c0724784680ba960a9fce4465f977efe275f3b290d89fab"} Jan 29 17:10:05 crc kubenswrapper[4886]: I0129 17:10:05.739592 4886 generic.go:334] "Generic (PLEG): container finished" podID="8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" containerID="18dccc69ea12ffd53b4d4c8e312d9e5ee415348aafbce21b941019b15077a6b6" exitCode=0 Jan 29 17:10:05 crc kubenswrapper[4886]: I0129 17:10:05.739659 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" event={"ID":"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1","Type":"ContainerDied","Data":"18dccc69ea12ffd53b4d4c8e312d9e5ee415348aafbce21b941019b15077a6b6"} Jan 29 17:10:05 crc kubenswrapper[4886]: I0129 17:10:05.788506 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.788480844 podStartE2EDuration="3.788480844s" podCreationTimestamp="2026-01-29 17:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:10:05.769703587 +0000 UTC m=+2888.678422859" watchObservedRunningTime="2026-01-29 17:10:05.788480844 +0000 UTC m=+2888.697200126" Jan 29 17:10:05 crc kubenswrapper[4886]: I0129 17:10:05.907958 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.051926 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-sb\") pod \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.052037 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-config\") pod \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.052080 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-swift-storage-0\") pod \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.052130 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-svc\") pod \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.052157 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-nb\") pod \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.052229 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9csz\" (UniqueName: \"kubernetes.io/projected/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-kube-api-access-x9csz\") pod \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\" (UID: \"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1\") " Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.061579 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-kube-api-access-x9csz" (OuterVolumeSpecName: "kube-api-access-x9csz") pod "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" (UID: "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1"). InnerVolumeSpecName "kube-api-access-x9csz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.129548 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" (UID: "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.140640 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-config" (OuterVolumeSpecName: "config") pod "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" (UID: "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.155935 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.155973 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9csz\" (UniqueName: \"kubernetes.io/projected/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-kube-api-access-x9csz\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.155987 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-config\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.157868 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" (UID: "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.206766 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" (UID: "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.209741 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" (UID: "8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.258857 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.258889 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.258898 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.752904 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" event={"ID":"8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1","Type":"ContainerDied","Data":"f636861581833a86368762de32a4ca62df7734738d06a2800f3b6b0ee4fb4aa1"} Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.752960 4886 scope.go:117] "RemoveContainer" containerID="18dccc69ea12ffd53b4d4c8e312d9e5ee415348aafbce21b941019b15077a6b6" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.753524 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-zdbgk" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.792685 4886 scope.go:117] "RemoveContainer" containerID="8bfd8a8fe8f520c0bdd3a5164fe133a10f3e76f19d1c34103c42b1d9ab4fdfeb" Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.809059 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zdbgk"] Jan 29 17:10:06 crc kubenswrapper[4886]: I0129 17:10:06.820555 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zdbgk"] Jan 29 17:10:07 crc kubenswrapper[4886]: I0129 17:10:07.765108 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da502cd2-7a05-4d82-a90e-cfbd4069b0ac","Type":"ContainerStarted","Data":"266a8e9c96bb1b9fbb7a767f2b35ad40929d744419c9ebb7543402aacf3910b9"} Jan 29 17:10:07 crc kubenswrapper[4886]: I0129 17:10:07.765470 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 17:10:07 crc kubenswrapper[4886]: I0129 17:10:07.765472 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="ceilometer-central-agent" containerID="cri-o://91baaab9d9528ba788b818d15e20639e4d6e2fffc89317503dfc698ecdb0a06c" gracePeriod=30 Jan 29 17:10:07 crc kubenswrapper[4886]: I0129 17:10:07.765603 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="proxy-httpd" containerID="cri-o://266a8e9c96bb1b9fbb7a767f2b35ad40929d744419c9ebb7543402aacf3910b9" gracePeriod=30 Jan 29 17:10:07 crc kubenswrapper[4886]: I0129 17:10:07.765652 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="sg-core" containerID="cri-o://189370fd8336eb715dd7e8e4fbb1c1dcacac0f2820ddab52e349e5fc03b6bbea" gracePeriod=30 
Jan 29 17:10:07 crc kubenswrapper[4886]: I0129 17:10:07.765693 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="ceilometer-notification-agent" containerID="cri-o://d4c2814ebfa5456f9a32d52477ed9133aa03f6c310e426d0feadc41c2659a8a9" gracePeriod=30 Jan 29 17:10:07 crc kubenswrapper[4886]: I0129 17:10:07.796462 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.380294784 podStartE2EDuration="9.796436499s" podCreationTimestamp="2026-01-29 17:09:58 +0000 UTC" firstStartedPulling="2026-01-29 17:09:59.534310502 +0000 UTC m=+2882.443029774" lastFinishedPulling="2026-01-29 17:10:06.950452217 +0000 UTC m=+2889.859171489" observedRunningTime="2026-01-29 17:10:07.78878708 +0000 UTC m=+2890.697506362" watchObservedRunningTime="2026-01-29 17:10:07.796436499 +0000 UTC m=+2890.705155771" Jan 29 17:10:08 crc kubenswrapper[4886]: I0129 17:10:08.629713 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" path="/var/lib/kubelet/pods/8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1/volumes" Jan 29 17:10:08 crc kubenswrapper[4886]: I0129 17:10:08.781471 4886 generic.go:334] "Generic (PLEG): container finished" podID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerID="266a8e9c96bb1b9fbb7a767f2b35ad40929d744419c9ebb7543402aacf3910b9" exitCode=0 Jan 29 17:10:08 crc kubenswrapper[4886]: I0129 17:10:08.781528 4886 generic.go:334] "Generic (PLEG): container finished" podID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerID="189370fd8336eb715dd7e8e4fbb1c1dcacac0f2820ddab52e349e5fc03b6bbea" exitCode=2 Jan 29 17:10:08 crc kubenswrapper[4886]: I0129 17:10:08.781542 4886 generic.go:334] "Generic (PLEG): container finished" podID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerID="d4c2814ebfa5456f9a32d52477ed9133aa03f6c310e426d0feadc41c2659a8a9" exitCode=0 Jan 29 17:10:08 crc kubenswrapper[4886]: I0129 17:10:08.781566 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da502cd2-7a05-4d82-a90e-cfbd4069b0ac","Type":"ContainerDied","Data":"266a8e9c96bb1b9fbb7a767f2b35ad40929d744419c9ebb7543402aacf3910b9"} Jan 29 17:10:08 crc kubenswrapper[4886]: I0129 17:10:08.781595 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da502cd2-7a05-4d82-a90e-cfbd4069b0ac","Type":"ContainerDied","Data":"189370fd8336eb715dd7e8e4fbb1c1dcacac0f2820ddab52e349e5fc03b6bbea"} Jan 29 17:10:08 crc kubenswrapper[4886]: I0129 17:10:08.781608 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da502cd2-7a05-4d82-a90e-cfbd4069b0ac","Type":"ContainerDied","Data":"d4c2814ebfa5456f9a32d52477ed9133aa03f6c310e426d0feadc41c2659a8a9"} Jan 29 17:10:09 crc kubenswrapper[4886]: I0129 17:10:09.801132 4886 generic.go:334] "Generic (PLEG): container finished" podID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerID="91baaab9d9528ba788b818d15e20639e4d6e2fffc89317503dfc698ecdb0a06c" exitCode=0 Jan 29 17:10:09 crc kubenswrapper[4886]: I0129 17:10:09.801177 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da502cd2-7a05-4d82-a90e-cfbd4069b0ac","Type":"ContainerDied","Data":"91baaab9d9528ba788b818d15e20639e4d6e2fffc89317503dfc698ecdb0a06c"} Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.294496 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.369494 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcsqj\" (UniqueName: \"kubernetes.io/projected/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-kube-api-access-fcsqj\") pod \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.369583 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-scripts\") pod \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.369679 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-config-data\") pod \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.369805 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-log-httpd\") pod \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.370302 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "da502cd2-7a05-4d82-a90e-cfbd4069b0ac" (UID: "da502cd2-7a05-4d82-a90e-cfbd4069b0ac"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.370452 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-run-httpd\") pod \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.370678 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "da502cd2-7a05-4d82-a90e-cfbd4069b0ac" (UID: "da502cd2-7a05-4d82-a90e-cfbd4069b0ac"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.370484 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-combined-ca-bundle\") pod \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.371121 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-sg-core-conf-yaml\") pod \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\" (UID: \"da502cd2-7a05-4d82-a90e-cfbd4069b0ac\") " Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.371948 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.371970 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.402506 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-scripts" (OuterVolumeSpecName: "scripts") pod "da502cd2-7a05-4d82-a90e-cfbd4069b0ac" (UID: "da502cd2-7a05-4d82-a90e-cfbd4069b0ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.402635 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-kube-api-access-fcsqj" (OuterVolumeSpecName: "kube-api-access-fcsqj") pod "da502cd2-7a05-4d82-a90e-cfbd4069b0ac" (UID: "da502cd2-7a05-4d82-a90e-cfbd4069b0ac"). InnerVolumeSpecName "kube-api-access-fcsqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.460056 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "da502cd2-7a05-4d82-a90e-cfbd4069b0ac" (UID: "da502cd2-7a05-4d82-a90e-cfbd4069b0ac"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.473933 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.473960 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcsqj\" (UniqueName: \"kubernetes.io/projected/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-kube-api-access-fcsqj\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.473969 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.527081 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da502cd2-7a05-4d82-a90e-cfbd4069b0ac" (UID: "da502cd2-7a05-4d82-a90e-cfbd4069b0ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.538022 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-config-data" (OuterVolumeSpecName: "config-data") pod "da502cd2-7a05-4d82-a90e-cfbd4069b0ac" (UID: "da502cd2-7a05-4d82-a90e-cfbd4069b0ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.581432 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.581499 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da502cd2-7a05-4d82-a90e-cfbd4069b0ac-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.813191 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da502cd2-7a05-4d82-a90e-cfbd4069b0ac","Type":"ContainerDied","Data":"72d7fa6925704b9669a07a61d5a64685973e8bd1e0037e203f9d28200da940d5"} Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.813232 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.813243 4886 scope.go:117] "RemoveContainer" containerID="266a8e9c96bb1b9fbb7a767f2b35ad40929d744419c9ebb7543402aacf3910b9" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.815188 4886 generic.go:334] "Generic (PLEG): container finished" podID="7a1c51cd-f91d-406b-815c-00879a9d6401" containerID="5be86521758fe7c03f20fd8b758e10774f421701b95693128fa47b2a2e5adc70" exitCode=0 Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.815230 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ddfqz" event={"ID":"7a1c51cd-f91d-406b-815c-00879a9d6401","Type":"ContainerDied","Data":"5be86521758fe7c03f20fd8b758e10774f421701b95693128fa47b2a2e5adc70"} Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.881400 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.884589 4886 scope.go:117] "RemoveContainer" containerID="189370fd8336eb715dd7e8e4fbb1c1dcacac0f2820ddab52e349e5fc03b6bbea" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.895633 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.912940 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:10 crc kubenswrapper[4886]: E0129 17:10:10.913917 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="sg-core" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.913939 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="sg-core" Jan 29 17:10:10 crc kubenswrapper[4886]: E0129 17:10:10.913957 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="proxy-httpd" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.913963 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="proxy-httpd" Jan 29 17:10:10 crc kubenswrapper[4886]: E0129 17:10:10.914001 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" containerName="dnsmasq-dns" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.914009 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" containerName="dnsmasq-dns" Jan 29 17:10:10 crc kubenswrapper[4886]: E0129 17:10:10.914018 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="ceilometer-central-agent" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.914024 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="ceilometer-central-agent" Jan 29 17:10:10 crc kubenswrapper[4886]: E0129 17:10:10.914035 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" containerName="init" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.914041 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" containerName="init" Jan 29 17:10:10 crc kubenswrapper[4886]: E0129 17:10:10.914055 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" 
containerName="ceilometer-notification-agent" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.914062 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="ceilometer-notification-agent" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.914283 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="ceilometer-central-agent" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.914301 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="ceilometer-notification-agent" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.914311 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="proxy-httpd" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.914373 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ccf7a7a-f65b-4942-9bfa-bc7a377e6ff1" containerName="dnsmasq-dns" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.914388 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" containerName="sg-core" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.917024 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.945362 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.945794 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.946070 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.946219 4886 scope.go:117] "RemoveContainer" containerID="d4c2814ebfa5456f9a32d52477ed9133aa03f6c310e426d0feadc41c2659a8a9" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.990698 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-run-httpd\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.990868 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.991018 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77p6n\" (UniqueName: \"kubernetes.io/projected/51203b48-4909-45b6-8c3a-296fc4ee639c-kube-api-access-77p6n\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.991268 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-scripts\") pod \"ceilometer-0\" (UID: 
\"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.991448 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.991542 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-log-httpd\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.991638 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-config-data\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:10 crc kubenswrapper[4886]: I0129 17:10:10.995911 4886 scope.go:117] "RemoveContainer" containerID="91baaab9d9528ba788b818d15e20639e4d6e2fffc89317503dfc698ecdb0a06c" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.093642 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-scripts\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.093754 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.093799 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-log-httpd\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.093839 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-config-data\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.093876 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-run-httpd\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.093926 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.093965 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77p6n\" (UniqueName: \"kubernetes.io/projected/51203b48-4909-45b6-8c3a-296fc4ee639c-kube-api-access-77p6n\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.095088 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-run-httpd\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.095394 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-log-httpd\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.098969 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-scripts\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.099209 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-config-data\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.104760 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.105546 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.128176 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77p6n\" (UniqueName: \"kubernetes.io/projected/51203b48-4909-45b6-8c3a-296fc4ee639c-kube-api-access-77p6n\") pod \"ceilometer-0\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.281943 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.769784 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:11 crc kubenswrapper[4886]: W0129 17:10:11.786504 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51203b48_4909_45b6_8c3a_296fc4ee639c.slice/crio-de5f49918f6704400cdc2de0d7791eff23d5b705cf50d627099de407ae90448b WatchSource:0}: Error finding container de5f49918f6704400cdc2de0d7791eff23d5b705cf50d627099de407ae90448b: Status 404 returned error can't find the container with id de5f49918f6704400cdc2de0d7791eff23d5b705cf50d627099de407ae90448b Jan 29 17:10:11 crc kubenswrapper[4886]: I0129 17:10:11.827987 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51203b48-4909-45b6-8c3a-296fc4ee639c","Type":"ContainerStarted","Data":"de5f49918f6704400cdc2de0d7791eff23d5b705cf50d627099de407ae90448b"} Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.324223 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.423207 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-combined-ca-bundle\") pod \"7a1c51cd-f91d-406b-815c-00879a9d6401\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.423511 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-config-data\") pod \"7a1c51cd-f91d-406b-815c-00879a9d6401\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.423571 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-scripts\") pod \"7a1c51cd-f91d-406b-815c-00879a9d6401\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.423782 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwxx4\" (UniqueName: \"kubernetes.io/projected/7a1c51cd-f91d-406b-815c-00879a9d6401-kube-api-access-xwxx4\") pod \"7a1c51cd-f91d-406b-815c-00879a9d6401\" (UID: \"7a1c51cd-f91d-406b-815c-00879a9d6401\") " Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.427453 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1c51cd-f91d-406b-815c-00879a9d6401-kube-api-access-xwxx4" (OuterVolumeSpecName: "kube-api-access-xwxx4") pod "7a1c51cd-f91d-406b-815c-00879a9d6401" (UID: "7a1c51cd-f91d-406b-815c-00879a9d6401"). InnerVolumeSpecName "kube-api-access-xwxx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.430471 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-scripts" (OuterVolumeSpecName: "scripts") pod "7a1c51cd-f91d-406b-815c-00879a9d6401" (UID: "7a1c51cd-f91d-406b-815c-00879a9d6401"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.456426 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a1c51cd-f91d-406b-815c-00879a9d6401" (UID: "7a1c51cd-f91d-406b-815c-00879a9d6401"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.457573 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-config-data" (OuterVolumeSpecName: "config-data") pod "7a1c51cd-f91d-406b-815c-00879a9d6401" (UID: "7a1c51cd-f91d-406b-815c-00879a9d6401"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.526772 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwxx4\" (UniqueName: \"kubernetes.io/projected/7a1c51cd-f91d-406b-815c-00879a9d6401-kube-api-access-xwxx4\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.527001 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.527087 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.527147 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1c51cd-f91d-406b-815c-00879a9d6401-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.627248 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da502cd2-7a05-4d82-a90e-cfbd4069b0ac" path="/var/lib/kubelet/pods/da502cd2-7a05-4d82-a90e-cfbd4069b0ac/volumes" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.780841 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.780925 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.788924 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.790542 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.841443 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51203b48-4909-45b6-8c3a-296fc4ee639c","Type":"ContainerStarted","Data":"c9c0e47c6badbee636eb54a74034a0d58d79d9a5f007d41423ec32b132adc41e"} Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.842592 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ddfqz" Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.842580 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ddfqz" event={"ID":"7a1c51cd-f91d-406b-815c-00879a9d6401","Type":"ContainerDied","Data":"f1662f2f91761a984c86477b3a390f7b3bd8f222aea924e68ce2bb82b98bbf96"} Jan 29 17:10:12 crc kubenswrapper[4886]: I0129 17:10:12.842781 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1662f2f91761a984c86477b3a390f7b3bd8f222aea924e68ce2bb82b98bbf96" Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.034756 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.035261 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b515f59a-4b3a-4821-bbec-8e622a8164e6" containerName="nova-api-log" containerID="cri-o://297512a17905e8884ba2dee2e1bd0e97f5fbde7e67ab2e041189401e3a8b1069" gracePeriod=30 Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.035495 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b515f59a-4b3a-4821-bbec-8e622a8164e6" containerName="nova-api-api" containerID="cri-o://5279babaff011b0a7c0724784680ba960a9fce4465f977efe275f3b290d89fab" gracePeriod=30 Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.048486 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.048695 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dd8b58c7-942f-4f89-88a0-ce374fd98f0b" containerName="nova-scheduler-scheduler" containerID="cri-o://9734db9b6c351c8b935d8796b19514bcaecf82f2265e11ccf340fb3e8e4c7834" gracePeriod=30 Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.092588 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.854950 4886 generic.go:334] "Generic (PLEG): container finished" podID="b515f59a-4b3a-4821-bbec-8e622a8164e6" containerID="5279babaff011b0a7c0724784680ba960a9fce4465f977efe275f3b290d89fab" exitCode=0 Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.855539 4886 generic.go:334] "Generic (PLEG): container finished" podID="b515f59a-4b3a-4821-bbec-8e622a8164e6" containerID="297512a17905e8884ba2dee2e1bd0e97f5fbde7e67ab2e041189401e3a8b1069" exitCode=143 Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.855021 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b515f59a-4b3a-4821-bbec-8e622a8164e6","Type":"ContainerDied","Data":"5279babaff011b0a7c0724784680ba960a9fce4465f977efe275f3b290d89fab"} Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.855669 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b515f59a-4b3a-4821-bbec-8e622a8164e6","Type":"ContainerDied","Data":"297512a17905e8884ba2dee2e1bd0e97f5fbde7e67ab2e041189401e3a8b1069"} Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.855695 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b515f59a-4b3a-4821-bbec-8e622a8164e6","Type":"ContainerDied","Data":"3b5aab9a83beedb9411f1928c81b699649b72f9a5c36a34dc864ad27dbc02c85"} Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.855707 
4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b5aab9a83beedb9411f1928c81b699649b72f9a5c36a34dc864ad27dbc02c85" Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.862252 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51203b48-4909-45b6-8c3a-296fc4ee639c","Type":"ContainerStarted","Data":"af32cb3d4cad94fb3c21ee16283db0307dd6a80318541f4accfe0f6d97cb6b84"} Jan 29 17:10:13 crc kubenswrapper[4886]: I0129 17:10:13.958622 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.081902 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b515f59a-4b3a-4821-bbec-8e622a8164e6-logs\") pod \"b515f59a-4b3a-4821-bbec-8e622a8164e6\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.082032 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-internal-tls-certs\") pod \"b515f59a-4b3a-4821-bbec-8e622a8164e6\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.082063 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-combined-ca-bundle\") pod \"b515f59a-4b3a-4821-bbec-8e622a8164e6\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.082080 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-public-tls-certs\") pod \"b515f59a-4b3a-4821-bbec-8e622a8164e6\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.082123 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-config-data\") pod \"b515f59a-4b3a-4821-bbec-8e622a8164e6\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.082153 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r59zt\" (UniqueName: \"kubernetes.io/projected/b515f59a-4b3a-4821-bbec-8e622a8164e6-kube-api-access-r59zt\") pod \"b515f59a-4b3a-4821-bbec-8e622a8164e6\" (UID: \"b515f59a-4b3a-4821-bbec-8e622a8164e6\") " Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.082941 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b515f59a-4b3a-4821-bbec-8e622a8164e6-logs" (OuterVolumeSpecName: "logs") pod "b515f59a-4b3a-4821-bbec-8e622a8164e6" (UID: "b515f59a-4b3a-4821-bbec-8e622a8164e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.102824 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b515f59a-4b3a-4821-bbec-8e622a8164e6-kube-api-access-r59zt" (OuterVolumeSpecName: "kube-api-access-r59zt") pod "b515f59a-4b3a-4821-bbec-8e622a8164e6" (UID: "b515f59a-4b3a-4821-bbec-8e622a8164e6"). 
InnerVolumeSpecName "kube-api-access-r59zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.131508 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-config-data" (OuterVolumeSpecName: "config-data") pod "b515f59a-4b3a-4821-bbec-8e622a8164e6" (UID: "b515f59a-4b3a-4821-bbec-8e622a8164e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.160691 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b515f59a-4b3a-4821-bbec-8e622a8164e6" (UID: "b515f59a-4b3a-4821-bbec-8e622a8164e6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.161007 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b515f59a-4b3a-4821-bbec-8e622a8164e6" (UID: "b515f59a-4b3a-4821-bbec-8e622a8164e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.185127 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.185169 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r59zt\" (UniqueName: \"kubernetes.io/projected/b515f59a-4b3a-4821-bbec-8e622a8164e6-kube-api-access-r59zt\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.185181 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b515f59a-4b3a-4821-bbec-8e622a8164e6-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.185190 4886 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.185198 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.208496 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b515f59a-4b3a-4821-bbec-8e622a8164e6" (UID: "b515f59a-4b3a-4821-bbec-8e622a8164e6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.295495 4886 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b515f59a-4b3a-4821-bbec-8e622a8164e6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.874726 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51203b48-4909-45b6-8c3a-296fc4ee639c","Type":"ContainerStarted","Data":"6c975034f363da994f8f028b9f44a46d5e4b43e5df94d066fa0723bd5320a3f5"} Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.874848 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.875077 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-metadata" containerID="cri-o://cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9" gracePeriod=30 Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.875031 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-log" containerID="cri-o://5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc" gracePeriod=30 Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.925090 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.937631 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.957381 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 17:10:14 crc kubenswrapper[4886]: E0129 17:10:14.957924 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b515f59a-4b3a-4821-bbec-8e622a8164e6" containerName="nova-api-log" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.957944 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b515f59a-4b3a-4821-bbec-8e622a8164e6" containerName="nova-api-log" Jan 29 17:10:14 crc kubenswrapper[4886]: E0129 17:10:14.957957 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b515f59a-4b3a-4821-bbec-8e622a8164e6" containerName="nova-api-api" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.957965 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b515f59a-4b3a-4821-bbec-8e622a8164e6" containerName="nova-api-api" Jan 29 17:10:14 crc kubenswrapper[4886]: E0129 17:10:14.958001 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c51cd-f91d-406b-815c-00879a9d6401" containerName="nova-manage" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.958007 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c51cd-f91d-406b-815c-00879a9d6401" containerName="nova-manage" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.958219 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b515f59a-4b3a-4821-bbec-8e622a8164e6" containerName="nova-api-log" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.958242 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b515f59a-4b3a-4821-bbec-8e622a8164e6" containerName="nova-api-api" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 
17:10:14.958260 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c51cd-f91d-406b-815c-00879a9d6401" containerName="nova-manage" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.959613 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.961571 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.961676 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.961714 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 29 17:10:14 crc kubenswrapper[4886]: I0129 17:10:14.989486 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.010539 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbffe358-e916-4693-b76d-09fd332a7082-logs\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.010600 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.010649 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpk7z\" (UniqueName: \"kubernetes.io/projected/cbffe358-e916-4693-b76d-09fd332a7082-kube-api-access-fpk7z\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.010690 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-config-data\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.010809 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-public-tls-certs\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.010831 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.107891 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b7z4z" podUID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerName="registry-server" probeResult="failure" output=< Jan 29 17:10:15 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" 
within 1s Jan 29 17:10:15 crc kubenswrapper[4886]: > Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.112863 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.112938 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpk7z\" (UniqueName: \"kubernetes.io/projected/cbffe358-e916-4693-b76d-09fd332a7082-kube-api-access-fpk7z\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.112977 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-config-data\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.113060 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-public-tls-certs\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.113080 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.113169 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbffe358-e916-4693-b76d-09fd332a7082-logs\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.113569 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbffe358-e916-4693-b76d-09fd332a7082-logs\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.119041 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-config-data\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.119511 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.120854 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc 
kubenswrapper[4886]: I0129 17:10:15.127926 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbffe358-e916-4693-b76d-09fd332a7082-public-tls-certs\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.136176 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpk7z\" (UniqueName: \"kubernetes.io/projected/cbffe358-e916-4693-b76d-09fd332a7082-kube-api-access-fpk7z\") pod \"nova-api-0\" (UID: \"cbffe358-e916-4693-b76d-09fd332a7082\") " pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.279031 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.803344 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.888557 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbffe358-e916-4693-b76d-09fd332a7082","Type":"ContainerStarted","Data":"e88fb79a196e941fabd58fb768bad1edc1da992688c9b33a1a1e6122f1242cb4"} Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.890066 4886 generic.go:334] "Generic (PLEG): container finished" podID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerID="5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc" exitCode=143 Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.890111 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba13f7f-cb9d-4147-9f9d-982bd5daac77","Type":"ContainerDied","Data":"5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc"} Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.891295 4886 generic.go:334] "Generic (PLEG): container finished" podID="dd8b58c7-942f-4f89-88a0-ce374fd98f0b" containerID="9734db9b6c351c8b935d8796b19514bcaecf82f2265e11ccf340fb3e8e4c7834" exitCode=0 Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.891318 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd8b58c7-942f-4f89-88a0-ce374fd98f0b","Type":"ContainerDied","Data":"9734db9b6c351c8b935d8796b19514bcaecf82f2265e11ccf340fb3e8e4c7834"} Jan 29 17:10:15 crc kubenswrapper[4886]: I0129 17:10:15.990245 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.143622 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-combined-ca-bundle\") pod \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.143736 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-config-data\") pod \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.143923 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wgk9\" (UniqueName: \"kubernetes.io/projected/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-kube-api-access-4wgk9\") pod \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\" (UID: \"dd8b58c7-942f-4f89-88a0-ce374fd98f0b\") " Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.148582 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-kube-api-access-4wgk9" (OuterVolumeSpecName: "kube-api-access-4wgk9") pod "dd8b58c7-942f-4f89-88a0-ce374fd98f0b" (UID: "dd8b58c7-942f-4f89-88a0-ce374fd98f0b"). InnerVolumeSpecName "kube-api-access-4wgk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.183421 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-config-data" (OuterVolumeSpecName: "config-data") pod "dd8b58c7-942f-4f89-88a0-ce374fd98f0b" (UID: "dd8b58c7-942f-4f89-88a0-ce374fd98f0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.199079 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd8b58c7-942f-4f89-88a0-ce374fd98f0b" (UID: "dd8b58c7-942f-4f89-88a0-ce374fd98f0b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.246989 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wgk9\" (UniqueName: \"kubernetes.io/projected/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-kube-api-access-4wgk9\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.247021 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.247031 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8b58c7-942f-4f89-88a0-ce374fd98f0b-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.643618 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b515f59a-4b3a-4821-bbec-8e622a8164e6" path="/var/lib/kubelet/pods/b515f59a-4b3a-4821-bbec-8e622a8164e6/volumes" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.912123 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd8b58c7-942f-4f89-88a0-ce374fd98f0b","Type":"ContainerDied","Data":"c2ea7d41eadeb9e0900ac95c53b4acc74be8017115cf4e43325000be7c90063b"} Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.912175 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.912196 4886 scope.go:117] "RemoveContainer" containerID="9734db9b6c351c8b935d8796b19514bcaecf82f2265e11ccf340fb3e8e4c7834" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.918770 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbffe358-e916-4693-b76d-09fd332a7082","Type":"ContainerStarted","Data":"f74f4068a780ec3e97c028d30192a5c360c29d9e96ab00f973a4915e0a4ec0b6"} Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.918805 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbffe358-e916-4693-b76d-09fd332a7082","Type":"ContainerStarted","Data":"c93af99841322471d4d39b5b6ef50088a4e01be653dbd6536ee4b3e2038de5e2"} Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.926643 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51203b48-4909-45b6-8c3a-296fc4ee639c","Type":"ContainerStarted","Data":"01c6694fd4df1d797b97e25cbe9f80e6eca4f580fbbf77224f8cc99225251a03"} Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.927660 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.962131 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.962106245 podStartE2EDuration="2.962106245s" podCreationTimestamp="2026-01-29 17:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:10:16.943551324 +0000 UTC m=+2899.852270596" watchObservedRunningTime="2026-01-29 17:10:16.962106245 +0000 UTC m=+2899.870825517" Jan 29 17:10:16 crc kubenswrapper[4886]: I0129 17:10:16.975625 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.398555709 podStartE2EDuration="6.975605401s" podCreationTimestamp="2026-01-29 17:10:10 +0000 UTC" firstStartedPulling="2026-01-29 17:10:11.797825743 +0000 UTC m=+2894.706545015" lastFinishedPulling="2026-01-29 17:10:16.374875435 +0000 UTC m=+2899.283594707" observedRunningTime="2026-01-29 17:10:16.958720568 +0000 UTC m=+2899.867439830" watchObservedRunningTime="2026-01-29 17:10:16.975605401 +0000 UTC m=+2899.884324683" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.008125 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.021969 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.032273 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:10:17 crc kubenswrapper[4886]: E0129 17:10:17.032842 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8b58c7-942f-4f89-88a0-ce374fd98f0b" containerName="nova-scheduler-scheduler" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.032864 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8b58c7-942f-4f89-88a0-ce374fd98f0b" containerName="nova-scheduler-scheduler" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.033060 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8b58c7-942f-4f89-88a0-ce374fd98f0b" containerName="nova-scheduler-scheduler" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.033885 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.036749 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.042000 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.172460 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc4c563c-21d3-41cf-aabf-dd4429d59b62-config-data\") pod \"nova-scheduler-0\" (UID: \"fc4c563c-21d3-41cf-aabf-dd4429d59b62\") " pod="openstack/nova-scheduler-0" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.172554 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4c563c-21d3-41cf-aabf-dd4429d59b62-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc4c563c-21d3-41cf-aabf-dd4429d59b62\") " pod="openstack/nova-scheduler-0" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.172755 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbfhj\" (UniqueName: \"kubernetes.io/projected/fc4c563c-21d3-41cf-aabf-dd4429d59b62-kube-api-access-bbfhj\") pod \"nova-scheduler-0\" (UID: \"fc4c563c-21d3-41cf-aabf-dd4429d59b62\") " pod="openstack/nova-scheduler-0" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.274555 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc4c563c-21d3-41cf-aabf-dd4429d59b62-config-data\") pod \"nova-scheduler-0\" (UID: \"fc4c563c-21d3-41cf-aabf-dd4429d59b62\") " pod="openstack/nova-scheduler-0" Jan 29 17:10:17 
crc kubenswrapper[4886]: I0129 17:10:17.274626 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4c563c-21d3-41cf-aabf-dd4429d59b62-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc4c563c-21d3-41cf-aabf-dd4429d59b62\") " pod="openstack/nova-scheduler-0" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.274766 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbfhj\" (UniqueName: \"kubernetes.io/projected/fc4c563c-21d3-41cf-aabf-dd4429d59b62-kube-api-access-bbfhj\") pod \"nova-scheduler-0\" (UID: \"fc4c563c-21d3-41cf-aabf-dd4429d59b62\") " pod="openstack/nova-scheduler-0" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.280443 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4c563c-21d3-41cf-aabf-dd4429d59b62-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc4c563c-21d3-41cf-aabf-dd4429d59b62\") " pod="openstack/nova-scheduler-0" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.283021 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc4c563c-21d3-41cf-aabf-dd4429d59b62-config-data\") pod \"nova-scheduler-0\" (UID: \"fc4c563c-21d3-41cf-aabf-dd4429d59b62\") " pod="openstack/nova-scheduler-0" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.299524 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbfhj\" (UniqueName: \"kubernetes.io/projected/fc4c563c-21d3-41cf-aabf-dd4429d59b62-kube-api-access-bbfhj\") pod \"nova-scheduler-0\" (UID: \"fc4c563c-21d3-41cf-aabf-dd4429d59b62\") " pod="openstack/nova-scheduler-0" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.359946 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.937735 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 17:10:17 crc kubenswrapper[4886]: I0129 17:10:17.951598 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc4c563c-21d3-41cf-aabf-dd4429d59b62","Type":"ContainerStarted","Data":"b39c129d992b6913ea3b322e36d56792fb7f27e379c2c13f26ce269ac248fa3f"} Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.016488 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.10:8775/\": read tcp 10.217.0.2:53798->10.217.1.10:8775: read: connection reset by peer" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.016619 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.10:8775/\": read tcp 10.217.0.2:53814->10.217.1.10:8775: read: connection reset by peer" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.533012 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.605715 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-combined-ca-bundle\") pod \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.605897 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-config-data\") pod \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.605984 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-nova-metadata-tls-certs\") pod \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.606083 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-logs\") pod \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.606135 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr8p2\" (UniqueName: \"kubernetes.io/projected/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-kube-api-access-dr8p2\") pod \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\" (UID: \"6ba13f7f-cb9d-4147-9f9d-982bd5daac77\") " Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.606853 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-logs" (OuterVolumeSpecName: "logs") pod "6ba13f7f-cb9d-4147-9f9d-982bd5daac77" (UID: "6ba13f7f-cb9d-4147-9f9d-982bd5daac77"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.616414 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-kube-api-access-dr8p2" (OuterVolumeSpecName: "kube-api-access-dr8p2") pod "6ba13f7f-cb9d-4147-9f9d-982bd5daac77" (UID: "6ba13f7f-cb9d-4147-9f9d-982bd5daac77"). InnerVolumeSpecName "kube-api-access-dr8p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.648250 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd8b58c7-942f-4f89-88a0-ce374fd98f0b" path="/var/lib/kubelet/pods/dd8b58c7-942f-4f89-88a0-ce374fd98f0b/volumes" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.671798 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-config-data" (OuterVolumeSpecName: "config-data") pod "6ba13f7f-cb9d-4147-9f9d-982bd5daac77" (UID: "6ba13f7f-cb9d-4147-9f9d-982bd5daac77"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.685230 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ba13f7f-cb9d-4147-9f9d-982bd5daac77" (UID: "6ba13f7f-cb9d-4147-9f9d-982bd5daac77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.706799 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6ba13f7f-cb9d-4147-9f9d-982bd5daac77" (UID: "6ba13f7f-cb9d-4147-9f9d-982bd5daac77"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.709746 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.709794 4886 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.709808 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-logs\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.709819 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr8p2\" (UniqueName: \"kubernetes.io/projected/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-kube-api-access-dr8p2\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.709830 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba13f7f-cb9d-4147-9f9d-982bd5daac77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.987440 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc4c563c-21d3-41cf-aabf-dd4429d59b62","Type":"ContainerStarted","Data":"5a7fd94ae03c209702afc1ec138d28e079580d82a66e66b5e311b5a921afa695"} Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.990456 4886 generic.go:334] "Generic (PLEG): container finished" podID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerID="cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9" exitCode=0 Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.990486 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba13f7f-cb9d-4147-9f9d-982bd5daac77","Type":"ContainerDied","Data":"cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9"} Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.990509 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba13f7f-cb9d-4147-9f9d-982bd5daac77","Type":"ContainerDied","Data":"590686b9473f5c18e61b69cef7feee9a7b36c136560c55bdbbed141a70bc112d"} Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.990525 4886 scope.go:117] 
"RemoveContainer" containerID="cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9" Jan 29 17:10:18 crc kubenswrapper[4886]: I0129 17:10:18.990606 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.005408 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.005393719 podStartE2EDuration="3.005393719s" podCreationTimestamp="2026-01-29 17:10:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:10:19.004257097 +0000 UTC m=+2901.912976379" watchObservedRunningTime="2026-01-29 17:10:19.005393719 +0000 UTC m=+2901.914112991" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.042091 4886 scope.go:117] "RemoveContainer" containerID="5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.054586 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.083859 4886 scope.go:117] "RemoveContainer" containerID="cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9" Jan 29 17:10:19 crc kubenswrapper[4886]: E0129 17:10:19.084501 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9\": container with ID starting with cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9 not found: ID does not exist" containerID="cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.084567 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9"} err="failed to get container status \"cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9\": rpc error: code = NotFound desc = could not find container \"cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9\": container with ID starting with cd779590c513b85f1be24ee1be77a1addf20dbbca3b8eb0c655a6287c5d23cb9 not found: ID does not exist" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.084593 4886 scope.go:117] "RemoveContainer" containerID="5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc" Jan 29 17:10:19 crc kubenswrapper[4886]: E0129 17:10:19.084932 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc\": container with ID starting with 5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc not found: ID does not exist" containerID="5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.084951 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc"} err="failed to get container status \"5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc\": rpc error: code = NotFound desc = could not find container \"5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc\": container with ID starting with 
5b523a0231e956d5db224e5c8db2f3e8aaf553d5abc7de07ad05e39c231cc3fc not found: ID does not exist" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.105563 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.136394 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:10:19 crc kubenswrapper[4886]: E0129 17:10:19.136970 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-log" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.136990 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-log" Jan 29 17:10:19 crc kubenswrapper[4886]: E0129 17:10:19.137013 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-metadata" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.137019 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-metadata" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.137259 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-log" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.137274 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" containerName="nova-metadata-metadata" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.138787 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.141602 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.144773 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.154455 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.242217 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a568175-84cc-425a-9adf-5013a7fb5171-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.242287 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqkw6\" (UniqueName: \"kubernetes.io/projected/9a568175-84cc-425a-9adf-5013a7fb5171-kube-api-access-fqkw6\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.242740 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a568175-84cc-425a-9adf-5013a7fb5171-config-data\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.243123 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a568175-84cc-425a-9adf-5013a7fb5171-logs\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.243279 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a568175-84cc-425a-9adf-5013a7fb5171-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.345528 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a568175-84cc-425a-9adf-5013a7fb5171-config-data\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.345674 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a568175-84cc-425a-9adf-5013a7fb5171-logs\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.345733 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a568175-84cc-425a-9adf-5013a7fb5171-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.345764 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a568175-84cc-425a-9adf-5013a7fb5171-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.345789 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqkw6\" (UniqueName: \"kubernetes.io/projected/9a568175-84cc-425a-9adf-5013a7fb5171-kube-api-access-fqkw6\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.347055 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a568175-84cc-425a-9adf-5013a7fb5171-logs\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.354836 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a568175-84cc-425a-9adf-5013a7fb5171-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.358805 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a568175-84cc-425a-9adf-5013a7fb5171-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 
29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.359320 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a568175-84cc-425a-9adf-5013a7fb5171-config-data\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.367152 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqkw6\" (UniqueName: \"kubernetes.io/projected/9a568175-84cc-425a-9adf-5013a7fb5171-kube-api-access-fqkw6\") pod \"nova-metadata-0\" (UID: \"9a568175-84cc-425a-9adf-5013a7fb5171\") " pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.472727 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 17:10:19 crc kubenswrapper[4886]: I0129 17:10:19.983856 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 17:10:19 crc kubenswrapper[4886]: W0129 17:10:19.987082 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a568175_84cc_425a_9adf_5013a7fb5171.slice/crio-c6aa4f4fb07bcd94dc23abe25c30aa14a5a66175f93a187f3b25d31820287687 WatchSource:0}: Error finding container c6aa4f4fb07bcd94dc23abe25c30aa14a5a66175f93a187f3b25d31820287687: Status 404 returned error can't find the container with id c6aa4f4fb07bcd94dc23abe25c30aa14a5a66175f93a187f3b25d31820287687 Jan 29 17:10:20 crc kubenswrapper[4886]: I0129 17:10:20.006612 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a568175-84cc-425a-9adf-5013a7fb5171","Type":"ContainerStarted","Data":"c6aa4f4fb07bcd94dc23abe25c30aa14a5a66175f93a187f3b25d31820287687"} Jan 29 17:10:20 crc kubenswrapper[4886]: I0129 17:10:20.633498 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba13f7f-cb9d-4147-9f9d-982bd5daac77" path="/var/lib/kubelet/pods/6ba13f7f-cb9d-4147-9f9d-982bd5daac77/volumes" Jan 29 17:10:21 crc kubenswrapper[4886]: I0129 17:10:21.024246 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a568175-84cc-425a-9adf-5013a7fb5171","Type":"ContainerStarted","Data":"8acefa6d1c42b715e1b3c36b2826a0a57ac2cf2b1a2590a3dff7b817d637c904"} Jan 29 17:10:21 crc kubenswrapper[4886]: I0129 17:10:21.024326 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a568175-84cc-425a-9adf-5013a7fb5171","Type":"ContainerStarted","Data":"b9fdf8d8e8bdb1ef81e9be52cdb85659ac35f6333566df9a59096924dc10bd8f"} Jan 29 17:10:21 crc kubenswrapper[4886]: I0129 17:10:21.050503 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.050483617 podStartE2EDuration="2.050483617s" podCreationTimestamp="2026-01-29 17:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:10:21.045177795 +0000 UTC m=+2903.953897127" watchObservedRunningTime="2026-01-29 17:10:21.050483617 +0000 UTC m=+2903.959202909" Jan 29 17:10:22 crc kubenswrapper[4886]: I0129 17:10:22.360094 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 29 17:10:24 crc kubenswrapper[4886]: I0129 17:10:24.113733 4886 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:10:24 crc kubenswrapper[4886]: I0129 17:10:24.163534 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:10:24 crc kubenswrapper[4886]: I0129 17:10:24.473784 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 17:10:24 crc kubenswrapper[4886]: I0129 17:10:24.473879 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 17:10:24 crc kubenswrapper[4886]: I0129 17:10:24.879109 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b7z4z"] Jan 29 17:10:25 crc kubenswrapper[4886]: I0129 17:10:25.279776 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 17:10:25 crc kubenswrapper[4886]: I0129 17:10:25.279840 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 17:10:26 crc kubenswrapper[4886]: I0129 17:10:26.088629 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b7z4z" podUID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerName="registry-server" containerID="cri-o://4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede" gracePeriod=2 Jan 29 17:10:26 crc kubenswrapper[4886]: I0129 17:10:26.293704 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cbffe358-e916-4693-b76d-09fd332a7082" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.17:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 17:10:26 crc kubenswrapper[4886]: I0129 17:10:26.293706 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cbffe358-e916-4693-b76d-09fd332a7082" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.17:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 17:10:26 crc kubenswrapper[4886]: I0129 17:10:26.892890 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.043320 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-catalog-content\") pod \"265d5adc-ace5-4008-99d5-206b5182e6d4\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.044011 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkxvc\" (UniqueName: \"kubernetes.io/projected/265d5adc-ace5-4008-99d5-206b5182e6d4-kube-api-access-xkxvc\") pod \"265d5adc-ace5-4008-99d5-206b5182e6d4\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.044233 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-utilities\") pod \"265d5adc-ace5-4008-99d5-206b5182e6d4\" (UID: \"265d5adc-ace5-4008-99d5-206b5182e6d4\") " Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.044778 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-utilities" (OuterVolumeSpecName: "utilities") pod "265d5adc-ace5-4008-99d5-206b5182e6d4" (UID: "265d5adc-ace5-4008-99d5-206b5182e6d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.045433 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.054113 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265d5adc-ace5-4008-99d5-206b5182e6d4-kube-api-access-xkxvc" (OuterVolumeSpecName: "kube-api-access-xkxvc") pod "265d5adc-ace5-4008-99d5-206b5182e6d4" (UID: "265d5adc-ace5-4008-99d5-206b5182e6d4"). InnerVolumeSpecName "kube-api-access-xkxvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.107072 4886 generic.go:334] "Generic (PLEG): container finished" podID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerID="4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede" exitCode=0 Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.107113 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7z4z" event={"ID":"265d5adc-ace5-4008-99d5-206b5182e6d4","Type":"ContainerDied","Data":"4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede"} Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.107140 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7z4z" event={"ID":"265d5adc-ace5-4008-99d5-206b5182e6d4","Type":"ContainerDied","Data":"b49a773367da81a381e19a2ba4ecf2f2565cbe6beacc718a457751390e647a71"} Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.107157 4886 scope.go:117] "RemoveContainer" containerID="4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.107297 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7z4z" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.120746 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "265d5adc-ace5-4008-99d5-206b5182e6d4" (UID: "265d5adc-ace5-4008-99d5-206b5182e6d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.127297 4886 scope.go:117] "RemoveContainer" containerID="3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.147263 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265d5adc-ace5-4008-99d5-206b5182e6d4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.147314 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkxvc\" (UniqueName: \"kubernetes.io/projected/265d5adc-ace5-4008-99d5-206b5182e6d4-kube-api-access-xkxvc\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.155840 4886 scope.go:117] "RemoveContainer" containerID="c1dd6ae46daebf75b61de05db1d9dcf57ca090cd74e3c93bdef7a80a5b1e0368" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.206110 4886 scope.go:117] "RemoveContainer" containerID="4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede" Jan 29 17:10:27 crc kubenswrapper[4886]: E0129 17:10:27.206552 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede\": container with ID starting with 4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede not found: ID does not exist" containerID="4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.206587 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede"} err="failed to get container status \"4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede\": rpc error: code = NotFound desc = could not find container \"4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede\": container with ID starting with 4f918436d3a4458be4f1385c7fcfd7781d59051384022442109a970fd2117ede not found: ID does not exist" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.206609 4886 scope.go:117] "RemoveContainer" containerID="3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173" Jan 29 17:10:27 crc kubenswrapper[4886]: E0129 17:10:27.206823 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173\": container with ID starting with 3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173 not found: ID does not exist" containerID="3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.206848 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173"} 
err="failed to get container status \"3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173\": rpc error: code = NotFound desc = could not find container \"3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173\": container with ID starting with 3348e603d16bdd075d9fa10e25af3a479e537e3ba1e85926303e7efb2d68b173 not found: ID does not exist" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.206863 4886 scope.go:117] "RemoveContainer" containerID="c1dd6ae46daebf75b61de05db1d9dcf57ca090cd74e3c93bdef7a80a5b1e0368" Jan 29 17:10:27 crc kubenswrapper[4886]: E0129 17:10:27.207219 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1dd6ae46daebf75b61de05db1d9dcf57ca090cd74e3c93bdef7a80a5b1e0368\": container with ID starting with c1dd6ae46daebf75b61de05db1d9dcf57ca090cd74e3c93bdef7a80a5b1e0368 not found: ID does not exist" containerID="c1dd6ae46daebf75b61de05db1d9dcf57ca090cd74e3c93bdef7a80a5b1e0368" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.207244 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1dd6ae46daebf75b61de05db1d9dcf57ca090cd74e3c93bdef7a80a5b1e0368"} err="failed to get container status \"c1dd6ae46daebf75b61de05db1d9dcf57ca090cd74e3c93bdef7a80a5b1e0368\": rpc error: code = NotFound desc = could not find container \"c1dd6ae46daebf75b61de05db1d9dcf57ca090cd74e3c93bdef7a80a5b1e0368\": container with ID starting with c1dd6ae46daebf75b61de05db1d9dcf57ca090cd74e3c93bdef7a80a5b1e0368 not found: ID does not exist" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.360709 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.424290 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.484114 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b7z4z"] Jan 29 17:10:27 crc kubenswrapper[4886]: I0129 17:10:27.495787 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b7z4z"] Jan 29 17:10:28 crc kubenswrapper[4886]: I0129 17:10:28.156488 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 29 17:10:28 crc kubenswrapper[4886]: I0129 17:10:28.647168 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265d5adc-ace5-4008-99d5-206b5182e6d4" path="/var/lib/kubelet/pods/265d5adc-ace5-4008-99d5-206b5182e6d4/volumes" Jan 29 17:10:29 crc kubenswrapper[4886]: I0129 17:10:29.472993 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 17:10:29 crc kubenswrapper[4886]: I0129 17:10:29.473265 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 17:10:30 crc kubenswrapper[4886]: I0129 17:10:30.489503 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9a568175-84cc-425a-9adf-5013a7fb5171" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.19:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 17:10:30 crc kubenswrapper[4886]: I0129 17:10:30.489526 4886 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="9a568175-84cc-425a-9adf-5013a7fb5171" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.19:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 17:10:35 crc kubenswrapper[4886]: I0129 17:10:35.286811 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 29 17:10:35 crc kubenswrapper[4886]: I0129 17:10:35.287636 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 29 17:10:35 crc kubenswrapper[4886]: I0129 17:10:35.288549 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 29 17:10:35 crc kubenswrapper[4886]: I0129 17:10:35.297615 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.043088 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-84jbh"] Jan 29 17:10:36 crc kubenswrapper[4886]: E0129 17:10:36.043932 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerName="registry-server" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.043949 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerName="registry-server" Jan 29 17:10:36 crc kubenswrapper[4886]: E0129 17:10:36.043970 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerName="extract-utilities" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.043980 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerName="extract-utilities" Jan 29 17:10:36 crc kubenswrapper[4886]: E0129 17:10:36.043992 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerName="extract-content" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.044000 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerName="extract-content" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.044269 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="265d5adc-ace5-4008-99d5-206b5182e6d4" containerName="registry-server" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.048109 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.058313 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-84jbh"] Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.178227 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-catalog-content\") pod \"redhat-marketplace-84jbh\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.178497 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-utilities\") pod \"redhat-marketplace-84jbh\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.178638 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96q7f\" (UniqueName: \"kubernetes.io/projected/217e65b9-b1b5-4244-930b-b85bc2e0a948-kube-api-access-96q7f\") pod \"redhat-marketplace-84jbh\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.209271 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.215839 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.280611 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-utilities\") pod \"redhat-marketplace-84jbh\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.280684 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96q7f\" (UniqueName: \"kubernetes.io/projected/217e65b9-b1b5-4244-930b-b85bc2e0a948-kube-api-access-96q7f\") pod \"redhat-marketplace-84jbh\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.280897 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-catalog-content\") pod \"redhat-marketplace-84jbh\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.281200 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-utilities\") pod \"redhat-marketplace-84jbh\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.281347 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-catalog-content\") pod \"redhat-marketplace-84jbh\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.306363 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96q7f\" (UniqueName: \"kubernetes.io/projected/217e65b9-b1b5-4244-930b-b85bc2e0a948-kube-api-access-96q7f\") pod \"redhat-marketplace-84jbh\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.393555 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:36 crc kubenswrapper[4886]: I0129 17:10:36.925620 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-84jbh"] Jan 29 17:10:36 crc kubenswrapper[4886]: W0129 17:10:36.932829 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod217e65b9_b1b5_4244_930b_b85bc2e0a948.slice/crio-de22974c151bb71b46851f3f7e77eea61bb7ffc33602315145dbd816afde3589 WatchSource:0}: Error finding container de22974c151bb71b46851f3f7e77eea61bb7ffc33602315145dbd816afde3589: Status 404 returned error can't find the container with id de22974c151bb71b46851f3f7e77eea61bb7ffc33602315145dbd816afde3589 Jan 29 17:10:37 crc kubenswrapper[4886]: I0129 17:10:37.219548 4886 generic.go:334] "Generic (PLEG): container finished" podID="217e65b9-b1b5-4244-930b-b85bc2e0a948" containerID="44adaef6a4c07eda3623f3ba09f063d46a2dbeeca14db313ce6dac3eb8544707" exitCode=0 Jan 29 17:10:37 crc kubenswrapper[4886]: I0129 17:10:37.221666 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84jbh" event={"ID":"217e65b9-b1b5-4244-930b-b85bc2e0a948","Type":"ContainerDied","Data":"44adaef6a4c07eda3623f3ba09f063d46a2dbeeca14db313ce6dac3eb8544707"} Jan 29 17:10:37 crc kubenswrapper[4886]: I0129 17:10:37.221743 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84jbh" event={"ID":"217e65b9-b1b5-4244-930b-b85bc2e0a948","Type":"ContainerStarted","Data":"de22974c151bb71b46851f3f7e77eea61bb7ffc33602315145dbd816afde3589"} Jan 29 17:10:38 crc kubenswrapper[4886]: I0129 17:10:38.238184 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84jbh" event={"ID":"217e65b9-b1b5-4244-930b-b85bc2e0a948","Type":"ContainerStarted","Data":"a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146"} Jan 29 17:10:39 crc kubenswrapper[4886]: I0129 17:10:39.479665 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 29 17:10:39 crc kubenswrapper[4886]: I0129 17:10:39.481709 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 29 17:10:39 crc kubenswrapper[4886]: I0129 17:10:39.484431 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 29 17:10:40 crc kubenswrapper[4886]: I0129 17:10:40.255217 4886 generic.go:334] "Generic (PLEG): container finished" podID="217e65b9-b1b5-4244-930b-b85bc2e0a948" containerID="a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146" exitCode=0 Jan 29 17:10:40 crc kubenswrapper[4886]: I0129 
17:10:40.255319 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84jbh" event={"ID":"217e65b9-b1b5-4244-930b-b85bc2e0a948","Type":"ContainerDied","Data":"a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146"} Jan 29 17:10:40 crc kubenswrapper[4886]: I0129 17:10:40.270040 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 29 17:10:41 crc kubenswrapper[4886]: I0129 17:10:41.270003 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84jbh" event={"ID":"217e65b9-b1b5-4244-930b-b85bc2e0a948","Type":"ContainerStarted","Data":"1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1"} Jan 29 17:10:41 crc kubenswrapper[4886]: I0129 17:10:41.311818 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 29 17:10:41 crc kubenswrapper[4886]: I0129 17:10:41.322801 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-84jbh" podStartSLOduration=1.877611385 podStartE2EDuration="5.322777416s" podCreationTimestamp="2026-01-29 17:10:36 +0000 UTC" firstStartedPulling="2026-01-29 17:10:37.222067491 +0000 UTC m=+2920.130786763" lastFinishedPulling="2026-01-29 17:10:40.667233522 +0000 UTC m=+2923.575952794" observedRunningTime="2026-01-29 17:10:41.289625187 +0000 UTC m=+2924.198344469" watchObservedRunningTime="2026-01-29 17:10:41.322777416 +0000 UTC m=+2924.231496698" Jan 29 17:10:45 crc kubenswrapper[4886]: I0129 17:10:45.358953 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 17:10:45 crc kubenswrapper[4886]: I0129 17:10:45.359774 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="dba0c99a-0f14-42bd-8822-ee79fc73ee41" containerName="kube-state-metrics" containerID="cri-o://27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2" gracePeriod=30 Jan 29 17:10:45 crc kubenswrapper[4886]: I0129 17:10:45.499353 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 29 17:10:45 crc kubenswrapper[4886]: I0129 17:10:45.500177 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="f0d54f6d-4531-4707-8c1a-aed5e0e36d0e" containerName="mysqld-exporter" containerID="cri-o://2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66" gracePeriod=30 Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.207738 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.214760 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.330083 4886 generic.go:334] "Generic (PLEG): container finished" podID="f0d54f6d-4531-4707-8c1a-aed5e0e36d0e" containerID="2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66" exitCode=2 Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.330140 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.330153 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e","Type":"ContainerDied","Data":"2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66"} Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.330182 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e","Type":"ContainerDied","Data":"a4b442eb660a759ea9b06148625ca4e079373c7e47cea96d0478208100ae22a9"} Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.330201 4886 scope.go:117] "RemoveContainer" containerID="2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.333165 4886 generic.go:334] "Generic (PLEG): container finished" podID="dba0c99a-0f14-42bd-8822-ee79fc73ee41" containerID="27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2" exitCode=2 Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.333198 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dba0c99a-0f14-42bd-8822-ee79fc73ee41","Type":"ContainerDied","Data":"27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2"} Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.333203 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.333221 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dba0c99a-0f14-42bd-8822-ee79fc73ee41","Type":"ContainerDied","Data":"e23683912c13c24ac6376c0e92dd23177282cc9bf4441644e7ddbf8a433b486b"} Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.370446 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrp8r\" (UniqueName: \"kubernetes.io/projected/dba0c99a-0f14-42bd-8822-ee79fc73ee41-kube-api-access-xrp8r\") pod \"dba0c99a-0f14-42bd-8822-ee79fc73ee41\" (UID: \"dba0c99a-0f14-42bd-8822-ee79fc73ee41\") " Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.370704 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-config-data\") pod \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\" (UID: \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.370905 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-combined-ca-bundle\") pod \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\" (UID: \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.370926 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v52w\" (UniqueName: \"kubernetes.io/projected/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-kube-api-access-5v52w\") pod \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\" (UID: \"f0d54f6d-4531-4707-8c1a-aed5e0e36d0e\") " Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.372487 4886 scope.go:117] "RemoveContainer" containerID="2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66" Jan 29 17:10:46 crc kubenswrapper[4886]: E0129 
17:10:46.373137 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66\": container with ID starting with 2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66 not found: ID does not exist" containerID="2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.373186 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66"} err="failed to get container status \"2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66\": rpc error: code = NotFound desc = could not find container \"2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66\": container with ID starting with 2df9bc2e05bc1630cc3e5fb6a640fa85bdf65d2d98be5d0f01536073ed245e66 not found: ID does not exist" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.373210 4886 scope.go:117] "RemoveContainer" containerID="27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.378169 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-kube-api-access-5v52w" (OuterVolumeSpecName: "kube-api-access-5v52w") pod "f0d54f6d-4531-4707-8c1a-aed5e0e36d0e" (UID: "f0d54f6d-4531-4707-8c1a-aed5e0e36d0e"). InnerVolumeSpecName "kube-api-access-5v52w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.383455 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dba0c99a-0f14-42bd-8822-ee79fc73ee41-kube-api-access-xrp8r" (OuterVolumeSpecName: "kube-api-access-xrp8r") pod "dba0c99a-0f14-42bd-8822-ee79fc73ee41" (UID: "dba0c99a-0f14-42bd-8822-ee79fc73ee41"). InnerVolumeSpecName "kube-api-access-xrp8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.395289 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.395586 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.406376 4886 scope.go:117] "RemoveContainer" containerID="27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2" Jan 29 17:10:46 crc kubenswrapper[4886]: E0129 17:10:46.406835 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2\": container with ID starting with 27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2 not found: ID does not exist" containerID="27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.406872 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2"} err="failed to get container status \"27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2\": rpc error: code = NotFound desc = could not find container \"27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2\": container with ID starting with 27931458465a13e72788f87cbc8b654d38049cab2e1e500e5508e4b6b86f09b2 not found: ID does not exist" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.423711 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0d54f6d-4531-4707-8c1a-aed5e0e36d0e" (UID: "f0d54f6d-4531-4707-8c1a-aed5e0e36d0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.464290 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.466171 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-config-data" (OuterVolumeSpecName: "config-data") pod "f0d54f6d-4531-4707-8c1a-aed5e0e36d0e" (UID: "f0d54f6d-4531-4707-8c1a-aed5e0e36d0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.479984 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.480622 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.480714 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v52w\" (UniqueName: \"kubernetes.io/projected/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e-kube-api-access-5v52w\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.480821 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrp8r\" (UniqueName: \"kubernetes.io/projected/dba0c99a-0f14-42bd-8822-ee79fc73ee41-kube-api-access-xrp8r\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.707514 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.737130 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.754065 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.775136 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.785860 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 17:10:46 crc kubenswrapper[4886]: E0129 17:10:46.786448 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dba0c99a-0f14-42bd-8822-ee79fc73ee41" containerName="kube-state-metrics" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.786466 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba0c99a-0f14-42bd-8822-ee79fc73ee41" containerName="kube-state-metrics" Jan 29 17:10:46 crc kubenswrapper[4886]: E0129 17:10:46.786489 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d54f6d-4531-4707-8c1a-aed5e0e36d0e" containerName="mysqld-exporter" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.786495 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d54f6d-4531-4707-8c1a-aed5e0e36d0e" containerName="mysqld-exporter" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.786725 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="dba0c99a-0f14-42bd-8822-ee79fc73ee41" containerName="kube-state-metrics" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.786741 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d54f6d-4531-4707-8c1a-aed5e0e36d0e" containerName="mysqld-exporter" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.787471 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.790059 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.791076 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.799399 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.801855 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.804281 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.804501 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.812631 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.825230 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.889735 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " pod="openstack/mysqld-exporter-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.889805 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " pod="openstack/mysqld-exporter-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.889839 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fa42ea64-73bc-439c-802c-65ef65a39015-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.889983 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa42ea64-73bc-439c-802c-65ef65a39015-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.890273 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6npsv\" (UniqueName: \"kubernetes.io/projected/fa42ea64-73bc-439c-802c-65ef65a39015-kube-api-access-6npsv\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.890469 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa42ea64-73bc-439c-802c-65ef65a39015-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.890506 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-config-data\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " pod="openstack/mysqld-exporter-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.890604 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x9qd\" (UniqueName: \"kubernetes.io/projected/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-kube-api-access-8x9qd\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " pod="openstack/mysqld-exporter-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.992774 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " pod="openstack/mysqld-exporter-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.992836 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " pod="openstack/mysqld-exporter-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.992878 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fa42ea64-73bc-439c-802c-65ef65a39015-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.992955 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa42ea64-73bc-439c-802c-65ef65a39015-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.993057 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6npsv\" (UniqueName: \"kubernetes.io/projected/fa42ea64-73bc-439c-802c-65ef65a39015-kube-api-access-6npsv\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.993108 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa42ea64-73bc-439c-802c-65ef65a39015-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.993133 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-config-data\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " pod="openstack/mysqld-exporter-0" Jan 29 17:10:46 crc kubenswrapper[4886]: I0129 17:10:46.993176 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x9qd\" (UniqueName: \"kubernetes.io/projected/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-kube-api-access-8x9qd\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " pod="openstack/mysqld-exporter-0" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.000426 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fa42ea64-73bc-439c-802c-65ef65a39015-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.000505 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa42ea64-73bc-439c-802c-65ef65a39015-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.000680 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa42ea64-73bc-439c-802c-65ef65a39015-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.000677 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-config-data\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " pod="openstack/mysqld-exporter-0" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.001278 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " pod="openstack/mysqld-exporter-0" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.001275 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " pod="openstack/mysqld-exporter-0" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.014233 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6npsv\" (UniqueName: \"kubernetes.io/projected/fa42ea64-73bc-439c-802c-65ef65a39015-kube-api-access-6npsv\") pod \"kube-state-metrics-0\" (UID: \"fa42ea64-73bc-439c-802c-65ef65a39015\") " pod="openstack/kube-state-metrics-0" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.018654 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x9qd\" (UniqueName: \"kubernetes.io/projected/aa7423ef-f68a-4969-a81b-fd2ce4dbc16a-kube-api-access-8x9qd\") pod \"mysqld-exporter-0\" (UID: \"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a\") " 
pod="openstack/mysqld-exporter-0" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.106702 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.122885 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.451479 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.505732 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-84jbh"] Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.630099 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 29 17:10:47 crc kubenswrapper[4886]: W0129 17:10:47.705954 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa42ea64_73bc_439c_802c_65ef65a39015.slice/crio-aeb8ad0df2cd782d683a3fd7adf10093560785121b51ab0a6e3cded974fa6ebc WatchSource:0}: Error finding container aeb8ad0df2cd782d683a3fd7adf10093560785121b51ab0a6e3cded974fa6ebc: Status 404 returned error can't find the container with id aeb8ad0df2cd782d683a3fd7adf10093560785121b51ab0a6e3cded974fa6ebc Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.713752 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.745745 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.746086 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="ceilometer-central-agent" containerID="cri-o://c9c0e47c6badbee636eb54a74034a0d58d79d9a5f007d41423ec32b132adc41e" gracePeriod=30 Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.746117 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="proxy-httpd" containerID="cri-o://01c6694fd4df1d797b97e25cbe9f80e6eca4f580fbbf77224f8cc99225251a03" gracePeriod=30 Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.746215 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="ceilometer-notification-agent" containerID="cri-o://af32cb3d4cad94fb3c21ee16283db0307dd6a80318541f4accfe0f6d97cb6b84" gracePeriod=30 Jan 29 17:10:47 crc kubenswrapper[4886]: I0129 17:10:47.746232 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="sg-core" containerID="cri-o://6c975034f363da994f8f028b9f44a46d5e4b43e5df94d066fa0723bd5320a3f5" gracePeriod=30 Jan 29 17:10:48 crc kubenswrapper[4886]: E0129 17:10:48.154562 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51203b48_4909_45b6_8c3a_296fc4ee639c.slice/crio-c9c0e47c6badbee636eb54a74034a0d58d79d9a5f007d41423ec32b132adc41e.scope\": RecentStats: unable to find data in memory 
cache]" Jan 29 17:10:48 crc kubenswrapper[4886]: E0129 17:10:48.154875 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51203b48_4909_45b6_8c3a_296fc4ee639c.slice/crio-c9c0e47c6badbee636eb54a74034a0d58d79d9a5f007d41423ec32b132adc41e.scope\": RecentStats: unable to find data in memory cache]" Jan 29 17:10:48 crc kubenswrapper[4886]: I0129 17:10:48.397196 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a","Type":"ContainerStarted","Data":"f5ab3a3ee772dea947ca3ed38718c8080a48c601b6be4ec50b20a99fe3b6c247"} Jan 29 17:10:48 crc kubenswrapper[4886]: I0129 17:10:48.405891 4886 generic.go:334] "Generic (PLEG): container finished" podID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerID="01c6694fd4df1d797b97e25cbe9f80e6eca4f580fbbf77224f8cc99225251a03" exitCode=0 Jan 29 17:10:48 crc kubenswrapper[4886]: I0129 17:10:48.405924 4886 generic.go:334] "Generic (PLEG): container finished" podID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerID="6c975034f363da994f8f028b9f44a46d5e4b43e5df94d066fa0723bd5320a3f5" exitCode=2 Jan 29 17:10:48 crc kubenswrapper[4886]: I0129 17:10:48.405953 4886 generic.go:334] "Generic (PLEG): container finished" podID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerID="c9c0e47c6badbee636eb54a74034a0d58d79d9a5f007d41423ec32b132adc41e" exitCode=0 Jan 29 17:10:48 crc kubenswrapper[4886]: I0129 17:10:48.406006 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51203b48-4909-45b6-8c3a-296fc4ee639c","Type":"ContainerDied","Data":"01c6694fd4df1d797b97e25cbe9f80e6eca4f580fbbf77224f8cc99225251a03"} Jan 29 17:10:48 crc kubenswrapper[4886]: I0129 17:10:48.406031 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51203b48-4909-45b6-8c3a-296fc4ee639c","Type":"ContainerDied","Data":"6c975034f363da994f8f028b9f44a46d5e4b43e5df94d066fa0723bd5320a3f5"} Jan 29 17:10:48 crc kubenswrapper[4886]: I0129 17:10:48.406041 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51203b48-4909-45b6-8c3a-296fc4ee639c","Type":"ContainerDied","Data":"c9c0e47c6badbee636eb54a74034a0d58d79d9a5f007d41423ec32b132adc41e"} Jan 29 17:10:48 crc kubenswrapper[4886]: I0129 17:10:48.409756 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fa42ea64-73bc-439c-802c-65ef65a39015","Type":"ContainerStarted","Data":"aeb8ad0df2cd782d683a3fd7adf10093560785121b51ab0a6e3cded974fa6ebc"} Jan 29 17:10:48 crc kubenswrapper[4886]: I0129 17:10:48.629128 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dba0c99a-0f14-42bd-8822-ee79fc73ee41" path="/var/lib/kubelet/pods/dba0c99a-0f14-42bd-8822-ee79fc73ee41/volumes" Jan 29 17:10:48 crc kubenswrapper[4886]: I0129 17:10:48.629765 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d54f6d-4531-4707-8c1a-aed5e0e36d0e" path="/var/lib/kubelet/pods/f0d54f6d-4531-4707-8c1a-aed5e0e36d0e/volumes" Jan 29 17:10:49 crc kubenswrapper[4886]: I0129 17:10:49.425628 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"aa7423ef-f68a-4969-a81b-fd2ce4dbc16a","Type":"ContainerStarted","Data":"1d925b8305416bd0e78aa2573e9ee07015a937abb9a0ce8302b468d57f13c6b7"} Jan 29 17:10:49 crc kubenswrapper[4886]: I0129 
17:10:49.428112 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fa42ea64-73bc-439c-802c-65ef65a39015","Type":"ContainerStarted","Data":"3ab4717e5b4649ebaf7fb0c6e6ca5e8969a97f1cd9b3dc4edfc0b5ab98c0de4c"} Jan 29 17:10:49 crc kubenswrapper[4886]: I0129 17:10:49.428277 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-84jbh" podUID="217e65b9-b1b5-4244-930b-b85bc2e0a948" containerName="registry-server" containerID="cri-o://1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1" gracePeriod=2 Jan 29 17:10:49 crc kubenswrapper[4886]: I0129 17:10:49.447154 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.782854499 podStartE2EDuration="3.447132742s" podCreationTimestamp="2026-01-29 17:10:46 +0000 UTC" firstStartedPulling="2026-01-29 17:10:47.627980049 +0000 UTC m=+2930.536699321" lastFinishedPulling="2026-01-29 17:10:48.292258292 +0000 UTC m=+2931.200977564" observedRunningTime="2026-01-29 17:10:49.441282674 +0000 UTC m=+2932.350001966" watchObservedRunningTime="2026-01-29 17:10:49.447132742 +0000 UTC m=+2932.355852014" Jan 29 17:10:49 crc kubenswrapper[4886]: I0129 17:10:49.473732 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.25827197 podStartE2EDuration="3.473714362s" podCreationTimestamp="2026-01-29 17:10:46 +0000 UTC" firstStartedPulling="2026-01-29 17:10:47.708191243 +0000 UTC m=+2930.616910535" lastFinishedPulling="2026-01-29 17:10:48.923633645 +0000 UTC m=+2931.832352927" observedRunningTime="2026-01-29 17:10:49.46524009 +0000 UTC m=+2932.373959372" watchObservedRunningTime="2026-01-29 17:10:49.473714362 +0000 UTC m=+2932.382433634" Jan 29 17:10:49 crc kubenswrapper[4886]: I0129 17:10:49.656820 4886 scope.go:117] "RemoveContainer" containerID="6412eac490b1fbd3d0b00a59dd461a3eb98d94b486a8096aadd0a5be64624a01" Jan 29 17:10:49 crc kubenswrapper[4886]: I0129 17:10:49.708467 4886 scope.go:117] "RemoveContainer" containerID="8d073617833fd03b3552145f85acbb902d34a0687d97b69de74b719dca519779" Jan 29 17:10:49 crc kubenswrapper[4886]: I0129 17:10:49.756578 4886 scope.go:117] "RemoveContainer" containerID="6e26b828a472fc3b1df8fa1fda19373a058c84b6a577b9a6475d17f33176e5c8" Jan 29 17:10:49 crc kubenswrapper[4886]: I0129 17:10:49.946242 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.069365 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96q7f\" (UniqueName: \"kubernetes.io/projected/217e65b9-b1b5-4244-930b-b85bc2e0a948-kube-api-access-96q7f\") pod \"217e65b9-b1b5-4244-930b-b85bc2e0a948\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.069669 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-catalog-content\") pod \"217e65b9-b1b5-4244-930b-b85bc2e0a948\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.070146 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-utilities\") pod \"217e65b9-b1b5-4244-930b-b85bc2e0a948\" (UID: \"217e65b9-b1b5-4244-930b-b85bc2e0a948\") " Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.070777 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-utilities" (OuterVolumeSpecName: "utilities") pod "217e65b9-b1b5-4244-930b-b85bc2e0a948" (UID: "217e65b9-b1b5-4244-930b-b85bc2e0a948"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.071643 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.078050 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217e65b9-b1b5-4244-930b-b85bc2e0a948-kube-api-access-96q7f" (OuterVolumeSpecName: "kube-api-access-96q7f") pod "217e65b9-b1b5-4244-930b-b85bc2e0a948" (UID: "217e65b9-b1b5-4244-930b-b85bc2e0a948"). InnerVolumeSpecName "kube-api-access-96q7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.121463 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "217e65b9-b1b5-4244-930b-b85bc2e0a948" (UID: "217e65b9-b1b5-4244-930b-b85bc2e0a948"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.175257 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/217e65b9-b1b5-4244-930b-b85bc2e0a948-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.175301 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96q7f\" (UniqueName: \"kubernetes.io/projected/217e65b9-b1b5-4244-930b-b85bc2e0a948-kube-api-access-96q7f\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.444736 4886 generic.go:334] "Generic (PLEG): container finished" podID="217e65b9-b1b5-4244-930b-b85bc2e0a948" containerID="1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1" exitCode=0 Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.446266 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84jbh" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.447499 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84jbh" event={"ID":"217e65b9-b1b5-4244-930b-b85bc2e0a948","Type":"ContainerDied","Data":"1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1"} Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.447570 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.447589 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84jbh" event={"ID":"217e65b9-b1b5-4244-930b-b85bc2e0a948","Type":"ContainerDied","Data":"de22974c151bb71b46851f3f7e77eea61bb7ffc33602315145dbd816afde3589"} Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.447606 4886 scope.go:117] "RemoveContainer" containerID="1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.493550 4886 scope.go:117] "RemoveContainer" containerID="a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.503395 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-84jbh"] Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.516681 4886 scope.go:117] "RemoveContainer" containerID="44adaef6a4c07eda3623f3ba09f063d46a2dbeeca14db313ce6dac3eb8544707" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.518801 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-84jbh"] Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.535824 4886 scope.go:117] "RemoveContainer" containerID="1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1" Jan 29 17:10:50 crc kubenswrapper[4886]: E0129 17:10:50.536377 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1\": container with ID starting with 1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1 not found: ID does not exist" containerID="1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.536417 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1"} err="failed to get container status \"1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1\": rpc error: code = NotFound desc = could not find container \"1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1\": container with ID starting with 1cf121c05278d2a79fa62d807a2f7e30e9e3f7f37ffab83863f6b16765571bd1 not found: ID does not exist" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.536442 4886 scope.go:117] "RemoveContainer" containerID="a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146" Jan 29 17:10:50 crc kubenswrapper[4886]: E0129 17:10:50.536693 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146\": container with ID starting with a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146 not found: ID does not exist" containerID="a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.536718 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146"} err="failed to get container status \"a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146\": rpc error: code = NotFound desc = could not find container \"a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146\": container with ID starting with a73252860a50c52042a273920d1fa676ee207346afa4366e940b19fa67393146 not found: ID does not exist" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.536736 4886 scope.go:117] "RemoveContainer" containerID="44adaef6a4c07eda3623f3ba09f063d46a2dbeeca14db313ce6dac3eb8544707" Jan 29 17:10:50 crc kubenswrapper[4886]: E0129 17:10:50.536947 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44adaef6a4c07eda3623f3ba09f063d46a2dbeeca14db313ce6dac3eb8544707\": container with ID starting with 44adaef6a4c07eda3623f3ba09f063d46a2dbeeca14db313ce6dac3eb8544707 not found: ID does not exist" containerID="44adaef6a4c07eda3623f3ba09f063d46a2dbeeca14db313ce6dac3eb8544707" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.536965 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44adaef6a4c07eda3623f3ba09f063d46a2dbeeca14db313ce6dac3eb8544707"} err="failed to get container status \"44adaef6a4c07eda3623f3ba09f063d46a2dbeeca14db313ce6dac3eb8544707\": rpc error: code = NotFound desc = could not find container \"44adaef6a4c07eda3623f3ba09f063d46a2dbeeca14db313ce6dac3eb8544707\": container with ID starting with 44adaef6a4c07eda3623f3ba09f063d46a2dbeeca14db313ce6dac3eb8544707 not found: ID does not exist" Jan 29 17:10:50 crc kubenswrapper[4886]: I0129 17:10:50.629436 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="217e65b9-b1b5-4244-930b-b85bc2e0a948" path="/var/lib/kubelet/pods/217e65b9-b1b5-4244-930b-b85bc2e0a948/volumes" Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.469188 4886 generic.go:334] "Generic (PLEG): container finished" podID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerID="af32cb3d4cad94fb3c21ee16283db0307dd6a80318541f4accfe0f6d97cb6b84" exitCode=0 Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.469558 4886 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"51203b48-4909-45b6-8c3a-296fc4ee639c","Type":"ContainerDied","Data":"af32cb3d4cad94fb3c21ee16283db0307dd6a80318541f4accfe0f6d97cb6b84"} Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.843044 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.982823 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-sg-core-conf-yaml\") pod \"51203b48-4909-45b6-8c3a-296fc4ee639c\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.982922 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-run-httpd\") pod \"51203b48-4909-45b6-8c3a-296fc4ee639c\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.983256 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77p6n\" (UniqueName: \"kubernetes.io/projected/51203b48-4909-45b6-8c3a-296fc4ee639c-kube-api-access-77p6n\") pod \"51203b48-4909-45b6-8c3a-296fc4ee639c\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.983395 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-config-data\") pod \"51203b48-4909-45b6-8c3a-296fc4ee639c\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.983420 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-scripts\") pod \"51203b48-4909-45b6-8c3a-296fc4ee639c\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.983424 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51203b48-4909-45b6-8c3a-296fc4ee639c" (UID: "51203b48-4909-45b6-8c3a-296fc4ee639c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.983496 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-combined-ca-bundle\") pod \"51203b48-4909-45b6-8c3a-296fc4ee639c\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.983545 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-log-httpd\") pod \"51203b48-4909-45b6-8c3a-296fc4ee639c\" (UID: \"51203b48-4909-45b6-8c3a-296fc4ee639c\") " Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.984236 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51203b48-4909-45b6-8c3a-296fc4ee639c" (UID: "51203b48-4909-45b6-8c3a-296fc4ee639c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.985395 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.985412 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51203b48-4909-45b6-8c3a-296fc4ee639c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:51 crc kubenswrapper[4886]: I0129 17:10:51.991579 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-scripts" (OuterVolumeSpecName: "scripts") pod "51203b48-4909-45b6-8c3a-296fc4ee639c" (UID: "51203b48-4909-45b6-8c3a-296fc4ee639c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.003556 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51203b48-4909-45b6-8c3a-296fc4ee639c-kube-api-access-77p6n" (OuterVolumeSpecName: "kube-api-access-77p6n") pod "51203b48-4909-45b6-8c3a-296fc4ee639c" (UID: "51203b48-4909-45b6-8c3a-296fc4ee639c"). InnerVolumeSpecName "kube-api-access-77p6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.014560 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51203b48-4909-45b6-8c3a-296fc4ee639c" (UID: "51203b48-4909-45b6-8c3a-296fc4ee639c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.082057 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51203b48-4909-45b6-8c3a-296fc4ee639c" (UID: "51203b48-4909-45b6-8c3a-296fc4ee639c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.087005 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77p6n\" (UniqueName: \"kubernetes.io/projected/51203b48-4909-45b6-8c3a-296fc4ee639c-kube-api-access-77p6n\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.087047 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.087062 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.087072 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.109074 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-config-data" (OuterVolumeSpecName: "config-data") pod "51203b48-4909-45b6-8c3a-296fc4ee639c" (UID: "51203b48-4909-45b6-8c3a-296fc4ee639c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.189249 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51203b48-4909-45b6-8c3a-296fc4ee639c-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.487171 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51203b48-4909-45b6-8c3a-296fc4ee639c","Type":"ContainerDied","Data":"de5f49918f6704400cdc2de0d7791eff23d5b705cf50d627099de407ae90448b"} Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.487225 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.487232 4886 scope.go:117] "RemoveContainer" containerID="01c6694fd4df1d797b97e25cbe9f80e6eca4f580fbbf77224f8cc99225251a03" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.510543 4886 scope.go:117] "RemoveContainer" containerID="6c975034f363da994f8f028b9f44a46d5e4b43e5df94d066fa0723bd5320a3f5" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.527988 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.564455 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.567217 4886 scope.go:117] "RemoveContainer" containerID="af32cb3d4cad94fb3c21ee16283db0307dd6a80318541f4accfe0f6d97cb6b84" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.583064 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:52 crc kubenswrapper[4886]: E0129 17:10:52.583727 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="ceilometer-notification-agent" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.583755 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="ceilometer-notification-agent" Jan 29 17:10:52 crc kubenswrapper[4886]: E0129 17:10:52.583775 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="ceilometer-central-agent" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.583785 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="ceilometer-central-agent" Jan 29 17:10:52 crc kubenswrapper[4886]: E0129 17:10:52.583813 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217e65b9-b1b5-4244-930b-b85bc2e0a948" containerName="registry-server" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.583822 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="217e65b9-b1b5-4244-930b-b85bc2e0a948" containerName="registry-server" Jan 29 17:10:52 crc kubenswrapper[4886]: E0129 17:10:52.583836 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="sg-core" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.583844 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="sg-core" Jan 29 17:10:52 crc kubenswrapper[4886]: E0129 17:10:52.583866 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217e65b9-b1b5-4244-930b-b85bc2e0a948" containerName="extract-utilities" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.583875 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="217e65b9-b1b5-4244-930b-b85bc2e0a948" containerName="extract-utilities" Jan 29 17:10:52 crc kubenswrapper[4886]: E0129 17:10:52.583891 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="proxy-httpd" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.583899 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="proxy-httpd" Jan 29 17:10:52 crc kubenswrapper[4886]: E0129 17:10:52.583918 4886 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="217e65b9-b1b5-4244-930b-b85bc2e0a948" containerName="extract-content" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.583928 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="217e65b9-b1b5-4244-930b-b85bc2e0a948" containerName="extract-content" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.584210 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="ceilometer-notification-agent" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.584235 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="proxy-httpd" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.584257 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="sg-core" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.584267 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="217e65b9-b1b5-4244-930b-b85bc2e0a948" containerName="registry-server" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.584289 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" containerName="ceilometer-central-agent" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.587214 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.590022 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.590355 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.590603 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.599959 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.600022 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.600070 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.600099 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqdzk\" (UniqueName: \"kubernetes.io/projected/23f9894b-5940-4f78-9062-719f7e7eca3a-kube-api-access-bqdzk\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc 
kubenswrapper[4886]: I0129 17:10:52.600143 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f9894b-5940-4f78-9062-719f7e7eca3a-log-httpd\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.600225 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-config-data\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.600300 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f9894b-5940-4f78-9062-719f7e7eca3a-run-httpd\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.600387 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-scripts\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.600889 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.613774 4886 scope.go:117] "RemoveContainer" containerID="c9c0e47c6badbee636eb54a74034a0d58d79d9a5f007d41423ec32b132adc41e" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.634016 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51203b48-4909-45b6-8c3a-296fc4ee639c" path="/var/lib/kubelet/pods/51203b48-4909-45b6-8c3a-296fc4ee639c/volumes" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.703240 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.703308 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqdzk\" (UniqueName: \"kubernetes.io/projected/23f9894b-5940-4f78-9062-719f7e7eca3a-kube-api-access-bqdzk\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.703410 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f9894b-5940-4f78-9062-719f7e7eca3a-log-httpd\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.703529 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-config-data\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.703659 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f9894b-5940-4f78-9062-719f7e7eca3a-run-httpd\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.703779 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-scripts\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.703840 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.703913 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.704763 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f9894b-5940-4f78-9062-719f7e7eca3a-run-httpd\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.704962 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f9894b-5940-4f78-9062-719f7e7eca3a-log-httpd\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.708959 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.709088 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-scripts\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.710361 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-config-data\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.710366 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.711156 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23f9894b-5940-4f78-9062-719f7e7eca3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.722641 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqdzk\" (UniqueName: \"kubernetes.io/projected/23f9894b-5940-4f78-9062-719f7e7eca3a-kube-api-access-bqdzk\") pod \"ceilometer-0\" (UID: \"23f9894b-5940-4f78-9062-719f7e7eca3a\") " pod="openstack/ceilometer-0" Jan 29 17:10:52 crc kubenswrapper[4886]: I0129 17:10:52.929606 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 17:10:53 crc kubenswrapper[4886]: I0129 17:10:53.478170 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 17:10:53 crc kubenswrapper[4886]: I0129 17:10:53.527394 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f9894b-5940-4f78-9062-719f7e7eca3a","Type":"ContainerStarted","Data":"d0da7ef4ede1584d49bc9408cce25318c17924d82696643df0b1e3e96c3c34f0"} Jan 29 17:10:54 crc kubenswrapper[4886]: I0129 17:10:54.539775 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f9894b-5940-4f78-9062-719f7e7eca3a","Type":"ContainerStarted","Data":"843c319a528bddd4c44aba6cc0736758be4c6e9ea9c94b4e1040657ccc80e6c7"} Jan 29 17:10:55 crc kubenswrapper[4886]: I0129 17:10:55.553353 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f9894b-5940-4f78-9062-719f7e7eca3a","Type":"ContainerStarted","Data":"ddb246caed2a5503ac0be66ecd7978cb4002333cea945243173364e30caf063f"} Jan 29 17:10:56 crc kubenswrapper[4886]: I0129 17:10:56.566615 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f9894b-5940-4f78-9062-719f7e7eca3a","Type":"ContainerStarted","Data":"aed5c9470747d60c829ea4caec4d37a15f4fec4d356c00fe4e8b2a5f3977bd48"} Jan 29 17:10:57 crc kubenswrapper[4886]: I0129 17:10:57.131588 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 29 17:10:58 crc kubenswrapper[4886]: I0129 17:10:58.586970 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f9894b-5940-4f78-9062-719f7e7eca3a","Type":"ContainerStarted","Data":"0e9e000088def39e8cd6869d2bf6cee480a2e648f4614f59664f4bcc0b5c282e"} Jan 29 17:10:58 crc kubenswrapper[4886]: I0129 17:10:58.587435 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 17:10:58 crc kubenswrapper[4886]: I0129 17:10:58.626296 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.358991151 podStartE2EDuration="6.626238711s" podCreationTimestamp="2026-01-29 17:10:52 +0000 UTC" firstStartedPulling="2026-01-29 17:10:53.481473978 +0000 UTC m=+2936.390193250" lastFinishedPulling="2026-01-29 17:10:57.748721538 +0000 UTC m=+2940.657440810" observedRunningTime="2026-01-29 17:10:58.614011422 +0000 UTC m=+2941.522730714" watchObservedRunningTime="2026-01-29 17:10:58.626238711 +0000 UTC m=+2941.534957983" Jan 29 17:11:22 crc kubenswrapper[4886]: I0129 17:11:22.966602 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 29 17:11:29 crc kubenswrapper[4886]: I0129 17:11:29.661669 4886 patch_prober.go:28] 
interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:11:29 crc kubenswrapper[4886]: I0129 17:11:29.662230 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:11:59 crc kubenswrapper[4886]: I0129 17:11:59.661770 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:11:59 crc kubenswrapper[4886]: I0129 17:11:59.662406 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:12:29 crc kubenswrapper[4886]: I0129 17:12:29.660893 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:12:29 crc kubenswrapper[4886]: I0129 17:12:29.661554 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:12:29 crc kubenswrapper[4886]: I0129 17:12:29.661617 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 17:12:29 crc kubenswrapper[4886]: I0129 17:12:29.662307 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 17:12:29 crc kubenswrapper[4886]: I0129 17:12:29.662404 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" gracePeriod=600 Jan 29 17:12:29 crc kubenswrapper[4886]: E0129 17:12:29.785481 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:12:30 crc kubenswrapper[4886]: I0129 17:12:30.721759 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" exitCode=0 Jan 29 17:12:30 crc kubenswrapper[4886]: I0129 17:12:30.721812 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d"} Jan 29 17:12:30 crc kubenswrapper[4886]: I0129 17:12:30.722116 4886 scope.go:117] "RemoveContainer" containerID="db3893b2fd9096a13f5744612d4a2bcbba80c7ed2ddb6ffa1307348c351b1963" Jan 29 17:12:30 crc kubenswrapper[4886]: I0129 17:12:30.723076 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:12:30 crc kubenswrapper[4886]: E0129 17:12:30.723604 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:12:44 crc kubenswrapper[4886]: I0129 17:12:44.615564 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:12:44 crc kubenswrapper[4886]: E0129 17:12:44.616287 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:12:50 crc kubenswrapper[4886]: I0129 17:12:50.638560 4886 scope.go:117] "RemoveContainer" containerID="33ad2a1126eff6cbb88ccc77df323fa1e654c5d2155c0985168da0fd53e1864a" Jan 29 17:12:50 crc kubenswrapper[4886]: I0129 17:12:50.698247 4886 scope.go:117] "RemoveContainer" containerID="fb8fc548f591be6e16630c1c9171e7ca1c4549f03107635ab3d54cf848daec39" Jan 29 17:12:50 crc kubenswrapper[4886]: I0129 17:12:50.727931 4886 scope.go:117] "RemoveContainer" containerID="95a7d3b8a9e32ae8ae2e3ef610040f7131916bc7de34db8cc1af0fec9c3ef960" Jan 29 17:12:50 crc kubenswrapper[4886]: I0129 17:12:50.749502 4886 scope.go:117] "RemoveContainer" containerID="9b68510df598b451ff2d4faad4a0af1636831487ecf72ad66ce874c635cd8d9e" Jan 29 17:12:58 crc kubenswrapper[4886]: I0129 17:12:58.623723 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:12:58 crc kubenswrapper[4886]: E0129 17:12:58.624445 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:13:12 crc kubenswrapper[4886]: I0129 17:13:12.616310 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:13:12 crc kubenswrapper[4886]: E0129 17:13:12.618861 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:13:25 crc kubenswrapper[4886]: I0129 17:13:25.615685 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:13:25 crc kubenswrapper[4886]: E0129 17:13:25.616907 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:13:38 crc kubenswrapper[4886]: I0129 17:13:38.622790 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:13:38 crc kubenswrapper[4886]: E0129 17:13:38.624137 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:13:51 crc kubenswrapper[4886]: I0129 17:13:51.094116 4886 scope.go:117] "RemoveContainer" containerID="bfb4e65e7631317b75e0b15c39b90031add550dcb40292d0be47c6410cfdc89e" Jan 29 17:13:51 crc kubenswrapper[4886]: I0129 17:13:51.122797 4886 scope.go:117] "RemoveContainer" containerID="2012816a934b66e60ffd90c59e1fa261b396b239468adba78a0dedfe4395c1be" Jan 29 17:13:51 crc kubenswrapper[4886]: I0129 17:13:51.156071 4886 scope.go:117] "RemoveContainer" containerID="794f8e0bf261a512c459ecf62c8c7c26bca5d60128a7b4f23734cabe8f7c898d" Jan 29 17:13:53 crc kubenswrapper[4886]: I0129 17:13:53.615244 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:13:53 crc kubenswrapper[4886]: E0129 17:13:53.616383 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:14:08 crc kubenswrapper[4886]: I0129 17:14:08.625130 4886 scope.go:117] "RemoveContainer" 
containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:14:08 crc kubenswrapper[4886]: E0129 17:14:08.626189 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:14:23 crc kubenswrapper[4886]: I0129 17:14:23.615027 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:14:23 crc kubenswrapper[4886]: E0129 17:14:23.615830 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:14:35 crc kubenswrapper[4886]: I0129 17:14:35.615786 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:14:35 crc kubenswrapper[4886]: E0129 17:14:35.618034 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:14:49 crc kubenswrapper[4886]: I0129 17:14:49.616455 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:14:49 crc kubenswrapper[4886]: E0129 17:14:49.617508 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:14:51 crc kubenswrapper[4886]: I0129 17:14:51.292022 4886 scope.go:117] "RemoveContainer" containerID="afb5da406ee3b16e59af7913d87b7d9742dbcfd595f22b00884d57064f6bdef1" Jan 29 17:14:51 crc kubenswrapper[4886]: I0129 17:14:51.329539 4886 scope.go:117] "RemoveContainer" containerID="62df5b8b647bd7eae2ddeb32c6165e5fc8cdbdb8c984d6b948088525b813e903" Jan 29 17:14:51 crc kubenswrapper[4886]: I0129 17:14:51.389928 4886 scope.go:117] "RemoveContainer" containerID="be55140e95fb2c7fd3a46b1ece79fa3d9132da294caa5ac8edf498151a8ce0b2" Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.074089 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-sgspp"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.086869 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-5ab6-account-create-update-4xrnn"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.100097 4886 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-f0b5-account-create-update-8b8vz"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.118756 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-sgspp"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.129320 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-4vq4n"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.139125 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-00e3-account-create-update-5hhsj"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.149288 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d860-account-create-update-5kd66"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.159910 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-mdvpb"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.172655 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-5ab6-account-create-update-4xrnn"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.182751 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d860-account-create-update-5kd66"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.191955 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-00e3-account-create-update-5hhsj"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.201686 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f0b5-account-create-update-8b8vz"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.212221 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-mdvpb"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.226614 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-4vq4n"] Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.626889 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29921ec8-f68f-4547-a2c0-d4d3f5de6960" path="/var/lib/kubelet/pods/29921ec8-f68f-4547-a2c0-d4d3f5de6960/volumes" Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.627595 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c16915-30cc-4a4f-81ff-4b82cf152968" path="/var/lib/kubelet/pods/66c16915-30cc-4a4f-81ff-4b82cf152968/volumes" Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.628245 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d" path="/var/lib/kubelet/pods/6bcdded9-ad2a-4fcc-82f1-0a13cf85b06d/volumes" Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.630284 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c996a30-f53d-49f1-a7d1-2ca23704b48e" path="/var/lib/kubelet/pods/7c996a30-f53d-49f1-a7d1-2ca23704b48e/volumes" Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.631751 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe" path="/var/lib/kubelet/pods/9c4e1c71-a857-4feb-8778-ba3aa8b7dbfe/volumes" Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.632565 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa302a57-5c6b-41b1-ac4b-7d9095b7b65a" path="/var/lib/kubelet/pods/aa302a57-5c6b-41b1-ac4b-7d9095b7b65a/volumes" Jan 29 17:14:56 crc kubenswrapper[4886]: I0129 17:14:56.633181 4886 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b696cd6b-840b-4505-9010-114d223a90e9" path="/var/lib/kubelet/pods/b696cd6b-840b-4505-9010-114d223a90e9/volumes" Jan 29 17:14:57 crc kubenswrapper[4886]: I0129 17:14:57.030535 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-fw887"] Jan 29 17:14:57 crc kubenswrapper[4886]: I0129 17:14:57.040363 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-fw887"] Jan 29 17:14:58 crc kubenswrapper[4886]: I0129 17:14:58.629186 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6479af73-81ef-4755-89b5-3a2dd44e99b3" path="/var/lib/kubelet/pods/6479af73-81ef-4755-89b5-3a2dd44e99b3/volumes" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.154307 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz"] Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.156342 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.158535 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.160291 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.174395 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz"] Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.273561 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hj4n\" (UniqueName: \"kubernetes.io/projected/875b9b50-c440-4567-b475-c890d3d5d713-kube-api-access-4hj4n\") pod \"collect-profiles-29495115-pkxcz\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.274677 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/875b9b50-c440-4567-b475-c890d3d5d713-config-volume\") pod \"collect-profiles-29495115-pkxcz\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.274855 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/875b9b50-c440-4567-b475-c890d3d5d713-secret-volume\") pod \"collect-profiles-29495115-pkxcz\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.377320 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hj4n\" (UniqueName: \"kubernetes.io/projected/875b9b50-c440-4567-b475-c890d3d5d713-kube-api-access-4hj4n\") pod \"collect-profiles-29495115-pkxcz\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:00 crc 
kubenswrapper[4886]: I0129 17:15:00.377468 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/875b9b50-c440-4567-b475-c890d3d5d713-config-volume\") pod \"collect-profiles-29495115-pkxcz\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.377547 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/875b9b50-c440-4567-b475-c890d3d5d713-secret-volume\") pod \"collect-profiles-29495115-pkxcz\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.378416 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/875b9b50-c440-4567-b475-c890d3d5d713-config-volume\") pod \"collect-profiles-29495115-pkxcz\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.382792 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/875b9b50-c440-4567-b475-c890d3d5d713-secret-volume\") pod \"collect-profiles-29495115-pkxcz\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.392500 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hj4n\" (UniqueName: \"kubernetes.io/projected/875b9b50-c440-4567-b475-c890d3d5d713-kube-api-access-4hj4n\") pod \"collect-profiles-29495115-pkxcz\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.477743 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:00 crc kubenswrapper[4886]: I0129 17:15:00.956903 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz"] Jan 29 17:15:01 crc kubenswrapper[4886]: I0129 17:15:01.408245 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" event={"ID":"875b9b50-c440-4567-b475-c890d3d5d713","Type":"ContainerStarted","Data":"db3e3f16f0932c632a2ab1ffff0f92252979a66c9e52244934f9d97bdd89246b"} Jan 29 17:15:01 crc kubenswrapper[4886]: I0129 17:15:01.408589 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" event={"ID":"875b9b50-c440-4567-b475-c890d3d5d713","Type":"ContainerStarted","Data":"e7264abfdb40ca1553c323e488eb75e4e7925d55c85c24f0028a060cfbb82eff"} Jan 29 17:15:01 crc kubenswrapper[4886]: I0129 17:15:01.432950 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" podStartSLOduration=1.432912639 podStartE2EDuration="1.432912639s" podCreationTimestamp="2026-01-29 17:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:15:01.423655886 +0000 UTC m=+3184.332375208" watchObservedRunningTime="2026-01-29 17:15:01.432912639 +0000 UTC m=+3184.341631911" Jan 29 17:15:02 crc kubenswrapper[4886]: I0129 17:15:02.420779 4886 generic.go:334] "Generic (PLEG): container finished" podID="875b9b50-c440-4567-b475-c890d3d5d713" containerID="db3e3f16f0932c632a2ab1ffff0f92252979a66c9e52244934f9d97bdd89246b" exitCode=0 Jan 29 17:15:02 crc kubenswrapper[4886]: I0129 17:15:02.420999 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" event={"ID":"875b9b50-c440-4567-b475-c890d3d5d713","Type":"ContainerDied","Data":"db3e3f16f0932c632a2ab1ffff0f92252979a66c9e52244934f9d97bdd89246b"} Jan 29 17:15:03 crc kubenswrapper[4886]: I0129 17:15:03.847118 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:03 crc kubenswrapper[4886]: I0129 17:15:03.962516 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/875b9b50-c440-4567-b475-c890d3d5d713-config-volume\") pod \"875b9b50-c440-4567-b475-c890d3d5d713\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " Jan 29 17:15:03 crc kubenswrapper[4886]: I0129 17:15:03.962628 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/875b9b50-c440-4567-b475-c890d3d5d713-secret-volume\") pod \"875b9b50-c440-4567-b475-c890d3d5d713\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " Jan 29 17:15:03 crc kubenswrapper[4886]: I0129 17:15:03.962970 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hj4n\" (UniqueName: \"kubernetes.io/projected/875b9b50-c440-4567-b475-c890d3d5d713-kube-api-access-4hj4n\") pod \"875b9b50-c440-4567-b475-c890d3d5d713\" (UID: \"875b9b50-c440-4567-b475-c890d3d5d713\") " Jan 29 17:15:03 crc kubenswrapper[4886]: I0129 17:15:03.963428 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/875b9b50-c440-4567-b475-c890d3d5d713-config-volume" (OuterVolumeSpecName: "config-volume") pod "875b9b50-c440-4567-b475-c890d3d5d713" (UID: "875b9b50-c440-4567-b475-c890d3d5d713"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:15:03 crc kubenswrapper[4886]: I0129 17:15:03.963643 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/875b9b50-c440-4567-b475-c890d3d5d713-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 17:15:03 crc kubenswrapper[4886]: I0129 17:15:03.969586 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875b9b50-c440-4567-b475-c890d3d5d713-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "875b9b50-c440-4567-b475-c890d3d5d713" (UID: "875b9b50-c440-4567-b475-c890d3d5d713"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:15:03 crc kubenswrapper[4886]: I0129 17:15:03.969716 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/875b9b50-c440-4567-b475-c890d3d5d713-kube-api-access-4hj4n" (OuterVolumeSpecName: "kube-api-access-4hj4n") pod "875b9b50-c440-4567-b475-c890d3d5d713" (UID: "875b9b50-c440-4567-b475-c890d3d5d713"). InnerVolumeSpecName "kube-api-access-4hj4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.055236 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xg8wq"] Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.066005 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/875b9b50-c440-4567-b475-c890d3d5d713-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.066049 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hj4n\" (UniqueName: \"kubernetes.io/projected/875b9b50-c440-4567-b475-c890d3d5d713-kube-api-access-4hj4n\") on node \"crc\" DevicePath \"\"" Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.067699 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xg8wq"] Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.444224 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" event={"ID":"875b9b50-c440-4567-b475-c890d3d5d713","Type":"ContainerDied","Data":"e7264abfdb40ca1553c323e488eb75e4e7925d55c85c24f0028a060cfbb82eff"} Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.444264 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7264abfdb40ca1553c323e488eb75e4e7925d55c85c24f0028a060cfbb82eff" Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.444300 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz" Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.494263 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9"] Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.504577 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495070-xnbx9"] Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.616850 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:15:04 crc kubenswrapper[4886]: E0129 17:15:04.617475 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.644765 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18290a86-b94a-42c5-9f50-1614077f881b" path="/var/lib/kubelet/pods/18290a86-b94a-42c5-9f50-1614077f881b/volumes" Jan 29 17:15:04 crc kubenswrapper[4886]: I0129 17:15:04.650547 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b94c98-0561-4135-a5af-023ef5f4ad67" path="/var/lib/kubelet/pods/40b94c98-0561-4135-a5af-023ef5f4ad67/volumes" Jan 29 17:15:06 crc kubenswrapper[4886]: I0129 17:15:06.028875 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-23ad-account-create-update-2dsmj"] Jan 29 17:15:06 crc kubenswrapper[4886]: I0129 17:15:06.040098 4886 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4"] Jan 29 17:15:06 crc kubenswrapper[4886]: I0129 17:15:06.053997 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-23ad-account-create-update-2dsmj"] Jan 29 17:15:06 crc kubenswrapper[4886]: I0129 17:15:06.066714 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sl5h4"] Jan 29 17:15:06 crc kubenswrapper[4886]: I0129 17:15:06.628933 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ed1f90-1318-483e-901c-bff80e1e94b6" path="/var/lib/kubelet/pods/d2ed1f90-1318-483e-901c-bff80e1e94b6/volumes" Jan 29 17:15:06 crc kubenswrapper[4886]: I0129 17:15:06.629831 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a69a79-4e4c-4815-8cf5-0864ff2b8026" path="/var/lib/kubelet/pods/d8a69a79-4e4c-4815-8cf5-0864ff2b8026/volumes" Jan 29 17:15:17 crc kubenswrapper[4886]: I0129 17:15:17.618293 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:15:17 crc kubenswrapper[4886]: E0129 17:15:17.619953 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:15:32 crc kubenswrapper[4886]: I0129 17:15:32.615667 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:15:32 crc kubenswrapper[4886]: E0129 17:15:32.624911 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:15:43 crc kubenswrapper[4886]: I0129 17:15:43.615757 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:15:43 crc kubenswrapper[4886]: E0129 17:15:43.616518 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:15:46 crc kubenswrapper[4886]: I0129 17:15:46.070648 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-b8qfq"] Jan 29 17:15:46 crc kubenswrapper[4886]: I0129 17:15:46.107353 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-vvrp4"] Jan 29 17:15:46 crc kubenswrapper[4886]: I0129 17:15:46.118576 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-5m27f"] Jan 29 17:15:46 crc kubenswrapper[4886]: I0129 17:15:46.128638 4886 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/barbican-db-create-vvrp4"] Jan 29 17:15:46 crc kubenswrapper[4886]: I0129 17:15:46.139566 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-b8qfq"] Jan 29 17:15:46 crc kubenswrapper[4886]: I0129 17:15:46.150032 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-5m27f"] Jan 29 17:15:46 crc kubenswrapper[4886]: I0129 17:15:46.627702 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219e979e-b3a8-42d0-8f23-737a86a2aefb" path="/var/lib/kubelet/pods/219e979e-b3a8-42d0-8f23-737a86a2aefb/volumes" Jan 29 17:15:46 crc kubenswrapper[4886]: I0129 17:15:46.628792 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61eedb40-ed14-42aa-9751-8bedcd699260" path="/var/lib/kubelet/pods/61eedb40-ed14-42aa-9751-8bedcd699260/volumes" Jan 29 17:15:46 crc kubenswrapper[4886]: I0129 17:15:46.629719 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca25333-29b2-4c38-9e85-ebd2a0d593d6" path="/var/lib/kubelet/pods/eca25333-29b2-4c38-9e85-ebd2a0d593d6/volumes" Jan 29 17:15:49 crc kubenswrapper[4886]: I0129 17:15:49.027748 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-mj8rv"] Jan 29 17:15:49 crc kubenswrapper[4886]: I0129 17:15:49.041502 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-mj8rv"] Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.420713 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bv9pm"] Jan 29 17:15:50 crc kubenswrapper[4886]: E0129 17:15:50.421718 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875b9b50-c440-4567-b475-c890d3d5d713" containerName="collect-profiles" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.421737 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="875b9b50-c440-4567-b475-c890d3d5d713" containerName="collect-profiles" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.422052 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="875b9b50-c440-4567-b475-c890d3d5d713" containerName="collect-profiles" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.424458 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.432393 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bv9pm"] Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.536807 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-catalog-content\") pod \"redhat-operators-bv9pm\" (UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.537080 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-utilities\") pod \"redhat-operators-bv9pm\" (UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.537123 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rq4p\" (UniqueName: \"kubernetes.io/projected/1f773961-b526-4457-870c-ac299a3e3312-kube-api-access-6rq4p\") pod \"redhat-operators-bv9pm\" (UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.639603 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34bb765-0998-45ea-bb61-9fbbc2c7359d" path="/var/lib/kubelet/pods/f34bb765-0998-45ea-bb61-9fbbc2c7359d/volumes" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.640201 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-utilities\") pod \"redhat-operators-bv9pm\" (UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.640261 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rq4p\" (UniqueName: \"kubernetes.io/projected/1f773961-b526-4457-870c-ac299a3e3312-kube-api-access-6rq4p\") pod \"redhat-operators-bv9pm\" (UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.640402 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-catalog-content\") pod \"redhat-operators-bv9pm\" (UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.640867 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-utilities\") pod \"redhat-operators-bv9pm\" (UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.640875 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-catalog-content\") pod \"redhat-operators-bv9pm\" 
(UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.662772 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rq4p\" (UniqueName: \"kubernetes.io/projected/1f773961-b526-4457-870c-ac299a3e3312-kube-api-access-6rq4p\") pod \"redhat-operators-bv9pm\" (UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:15:50 crc kubenswrapper[4886]: I0129 17:15:50.746869 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:15:51 crc kubenswrapper[4886]: I0129 17:15:51.510859 4886 scope.go:117] "RemoveContainer" containerID="78746abbdca4d80f0a57707d5af0310c508403ee469b611bd3861cf01570354a" Jan 29 17:15:51 crc kubenswrapper[4886]: I0129 17:15:51.511409 4886 patch_prober.go:28] interesting pod/logging-loki-distributor-5f678c8dd6-2jzzb container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/ready\": context deadline exceeded" start-of-body= Jan 29 17:15:51 crc kubenswrapper[4886]: I0129 17:15:51.511456 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-2jzzb" podUID="befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.55:3101/ready\": context deadline exceeded" Jan 29 17:15:51 crc kubenswrapper[4886]: I0129 17:15:51.581666 4886 scope.go:117] "RemoveContainer" containerID="ce7bb70d8d66605a00b65db196f138b8d093db85ba2aba770dcd073411b5b8b4" Jan 29 17:15:51 crc kubenswrapper[4886]: I0129 17:15:51.636015 4886 scope.go:117] "RemoveContainer" containerID="2706075df7ed398bfa86a5019c0c0b891534965545aed4044f6858df83babfa9" Jan 29 17:15:51 crc kubenswrapper[4886]: I0129 17:15:51.713830 4886 scope.go:117] "RemoveContainer" containerID="bb6b6c4443538f6a82366349284b39cf96fcba5ff7da991fc88f83ec4dbea3cd" Jan 29 17:15:51 crc kubenswrapper[4886]: I0129 17:15:51.785076 4886 scope.go:117] "RemoveContainer" containerID="3a64bd79066ba13789ce6be118a26c29652e1e5c788ad39a1b41f13dad0dd1c1" Jan 29 17:15:51 crc kubenswrapper[4886]: I0129 17:15:51.814959 4886 scope.go:117] "RemoveContainer" containerID="20030a467bab27996b15106f17b7491349b629c6d6de493fc3b1efb1f226e72c" Jan 29 17:15:51 crc kubenswrapper[4886]: I0129 17:15:51.847676 4886 scope.go:117] "RemoveContainer" containerID="9211a739518fb120e2bda32757d910dcbc67d03a2ddbfea02f5bc9964d2f0a2d" Jan 29 17:15:51 crc kubenswrapper[4886]: I0129 17:15:51.894350 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bv9pm"] Jan 29 17:15:51 crc kubenswrapper[4886]: I0129 17:15:51.920693 4886 scope.go:117] "RemoveContainer" containerID="2e89a5a701ca89a4fedcbc0c8d956d6d340377591f80cf75f3cdedc6fb2cd6f3" Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.004215 4886 scope.go:117] "RemoveContainer" containerID="5f38a23b3e231c3670461bd30eb72fab48714dac00ff0dbd8042edb99ce295c4" Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.058768 4886 scope.go:117] "RemoveContainer" containerID="c217cd04d2dba654b23c94e4b5b9acb5912a4546fafe4781e26a2d0d53058004" Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.097136 4886 scope.go:117] "RemoveContainer" containerID="5019558a9253bbef2f27d289d48dcc75d2b0f7a1469d88aa8fb186da0d61df99" Jan 29 17:15:52 crc kubenswrapper[4886]: 
I0129 17:15:52.139461 4886 scope.go:117] "RemoveContainer" containerID="cbbd4f5360c0e0e269db9be0e3b0c9d872ff0fa28897b05c76dba7a51c4b1e4c" Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.165583 4886 scope.go:117] "RemoveContainer" containerID="fbecb6255a3f2d33607adb71963134e7eb4f057014a12ad026702a5429304db4" Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.218215 4886 scope.go:117] "RemoveContainer" containerID="dae301d02f31a6be0962a543705953e6d92f427e7aa9bc8443d7688a4f7705a4" Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.250162 4886 scope.go:117] "RemoveContainer" containerID="ef7ef7e1c633f815512fbc83adaa9bb46d23ddf73eb8c93c02d1c3c3b64a5fcf" Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.280828 4886 scope.go:117] "RemoveContainer" containerID="11300dda6841f3bcadbf8fc0b293c71f220072872935dad2eeec46ba483d2773" Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.324888 4886 scope.go:117] "RemoveContainer" containerID="d34996a936f771ac75eec769fb4795e0b3637c5867ba052c3b34c2c7b2aee667" Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.354909 4886 scope.go:117] "RemoveContainer" containerID="0341a2566f1bb6385e4ca19bd7599e154fd2818c69290a143a8dae194ef6f346" Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.563270 4886 generic.go:334] "Generic (PLEG): container finished" podID="1f773961-b526-4457-870c-ac299a3e3312" containerID="398572a57dafba1fd44cb5ba23bcfc932f80aa853a274d63caeb2dea379597de" exitCode=0 Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.563373 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bv9pm" event={"ID":"1f773961-b526-4457-870c-ac299a3e3312","Type":"ContainerDied","Data":"398572a57dafba1fd44cb5ba23bcfc932f80aa853a274d63caeb2dea379597de"} Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.563422 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bv9pm" event={"ID":"1f773961-b526-4457-870c-ac299a3e3312","Type":"ContainerStarted","Data":"1cee306e378c6e1adda858d2bbd9e36da757769a33d53cb9a3ec25090fcac3dd"} Jan 29 17:15:52 crc kubenswrapper[4886]: I0129 17:15:52.569483 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 17:15:54 crc kubenswrapper[4886]: I0129 17:15:54.594735 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bv9pm" event={"ID":"1f773961-b526-4457-870c-ac299a3e3312","Type":"ContainerStarted","Data":"b9e457fec0b46000ce1469c5ea146165937abd84d20dd0308ec4a5fc11ab5a73"} Jan 29 17:15:56 crc kubenswrapper[4886]: I0129 17:15:56.616684 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:15:56 crc kubenswrapper[4886]: E0129 17:15:56.617599 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:15:58 crc kubenswrapper[4886]: I0129 17:15:58.035692 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-bd38-account-create-update-rgmr5"] Jan 29 17:15:58 crc kubenswrapper[4886]: I0129 17:15:58.046842 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-bd38-account-create-update-rgmr5"] Jan 29 17:15:58 crc kubenswrapper[4886]: I0129 17:15:58.648321 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c31fe7aa-0ad1-44ef-a748-b4f366a4d374" path="/var/lib/kubelet/pods/c31fe7aa-0ad1-44ef-a748-b4f366a4d374/volumes" Jan 29 17:15:59 crc kubenswrapper[4886]: I0129 17:15:59.662873 4886 generic.go:334] "Generic (PLEG): container finished" podID="1f773961-b526-4457-870c-ac299a3e3312" containerID="b9e457fec0b46000ce1469c5ea146165937abd84d20dd0308ec4a5fc11ab5a73" exitCode=0 Jan 29 17:15:59 crc kubenswrapper[4886]: I0129 17:15:59.662914 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bv9pm" event={"ID":"1f773961-b526-4457-870c-ac299a3e3312","Type":"ContainerDied","Data":"b9e457fec0b46000ce1469c5ea146165937abd84d20dd0308ec4a5fc11ab5a73"} Jan 29 17:16:00 crc kubenswrapper[4886]: I0129 17:16:00.033177 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-4501-account-create-update-hj72z"] Jan 29 17:16:00 crc kubenswrapper[4886]: I0129 17:16:00.045214 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-70c1-account-create-update-gwzzv"] Jan 29 17:16:00 crc kubenswrapper[4886]: I0129 17:16:00.055569 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-4501-account-create-update-hj72z"] Jan 29 17:16:00 crc kubenswrapper[4886]: I0129 17:16:00.065129 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e433-account-create-update-qm5sx"] Jan 29 17:16:00 crc kubenswrapper[4886]: I0129 17:16:00.074156 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e433-account-create-update-qm5sx"] Jan 29 17:16:00 crc kubenswrapper[4886]: I0129 17:16:00.085247 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-70c1-account-create-update-gwzzv"] Jan 29 17:16:00 crc kubenswrapper[4886]: I0129 17:16:00.627131 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b3dc785-5f55-49ca-8678-5105ba7e0568" path="/var/lib/kubelet/pods/2b3dc785-5f55-49ca-8678-5105ba7e0568/volumes" Jan 29 17:16:00 crc kubenswrapper[4886]: I0129 17:16:00.627930 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95df3f15-8d1d-4baf-bbb6-df4939f0d201" path="/var/lib/kubelet/pods/95df3f15-8d1d-4baf-bbb6-df4939f0d201/volumes" Jan 29 17:16:00 crc kubenswrapper[4886]: I0129 17:16:00.628644 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8e697ee-193d-4ce1-9905-cebf2e6ba7ff" path="/var/lib/kubelet/pods/b8e697ee-193d-4ce1-9905-cebf2e6ba7ff/volumes" Jan 29 17:16:01 crc kubenswrapper[4886]: I0129 17:16:01.685152 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bv9pm" event={"ID":"1f773961-b526-4457-870c-ac299a3e3312","Type":"ContainerStarted","Data":"1a0aa82fdb2a0a8ce345b81c0f3dabeccc7dfaf4d0119db5450b96bd81c1f459"} Jan 29 17:16:01 crc kubenswrapper[4886]: I0129 17:16:01.710693 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bv9pm" podStartSLOduration=3.374734529 podStartE2EDuration="11.71067789s" podCreationTimestamp="2026-01-29 17:15:50 +0000 UTC" firstStartedPulling="2026-01-29 17:15:52.569166298 +0000 UTC m=+3235.477885570" lastFinishedPulling="2026-01-29 17:16:00.905109659 +0000 UTC m=+3243.813828931" observedRunningTime="2026-01-29 17:16:01.707107379 +0000 UTC m=+3244.615826681" 
watchObservedRunningTime="2026-01-29 17:16:01.71067789 +0000 UTC m=+3244.619397162" Jan 29 17:16:03 crc kubenswrapper[4886]: I0129 17:16:03.035401 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8whvl"] Jan 29 17:16:03 crc kubenswrapper[4886]: I0129 17:16:03.048019 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8whvl"] Jan 29 17:16:04 crc kubenswrapper[4886]: I0129 17:16:04.629844 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c9729b7-e21b-4509-b337-618094fb2d52" path="/var/lib/kubelet/pods/6c9729b7-e21b-4509-b337-618094fb2d52/volumes" Jan 29 17:16:07 crc kubenswrapper[4886]: I0129 17:16:07.616402 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:16:07 crc kubenswrapper[4886]: E0129 17:16:07.617495 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:16:10 crc kubenswrapper[4886]: I0129 17:16:10.748004 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:16:10 crc kubenswrapper[4886]: I0129 17:16:10.748421 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:16:11 crc kubenswrapper[4886]: I0129 17:16:11.804879 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bv9pm" podUID="1f773961-b526-4457-870c-ac299a3e3312" containerName="registry-server" probeResult="failure" output=< Jan 29 17:16:11 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 17:16:11 crc kubenswrapper[4886]: > Jan 29 17:16:14 crc kubenswrapper[4886]: I0129 17:16:14.044132 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-thqn5"] Jan 29 17:16:14 crc kubenswrapper[4886]: I0129 17:16:14.053607 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-thqn5"] Jan 29 17:16:14 crc kubenswrapper[4886]: I0129 17:16:14.629030 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f114908-5594-4378-939f-f54b2157d676" path="/var/lib/kubelet/pods/9f114908-5594-4378-939f-f54b2157d676/volumes" Jan 29 17:16:20 crc kubenswrapper[4886]: I0129 17:16:20.617797 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:16:20 crc kubenswrapper[4886]: E0129 17:16:20.618961 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:16:21 crc kubenswrapper[4886]: I0129 17:16:21.825510 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bv9pm" 
podUID="1f773961-b526-4457-870c-ac299a3e3312" containerName="registry-server" probeResult="failure" output=< Jan 29 17:16:21 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 17:16:21 crc kubenswrapper[4886]: > Jan 29 17:16:30 crc kubenswrapper[4886]: I0129 17:16:30.802866 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:16:30 crc kubenswrapper[4886]: I0129 17:16:30.851375 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:16:33 crc kubenswrapper[4886]: I0129 17:16:33.615458 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:16:33 crc kubenswrapper[4886]: E0129 17:16:33.616274 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:16:34 crc kubenswrapper[4886]: I0129 17:16:34.418013 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bv9pm"] Jan 29 17:16:34 crc kubenswrapper[4886]: I0129 17:16:34.418274 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bv9pm" podUID="1f773961-b526-4457-870c-ac299a3e3312" containerName="registry-server" containerID="cri-o://1a0aa82fdb2a0a8ce345b81c0f3dabeccc7dfaf4d0119db5450b96bd81c1f459" gracePeriod=2 Jan 29 17:16:35 crc kubenswrapper[4886]: I0129 17:16:35.026293 4886 generic.go:334] "Generic (PLEG): container finished" podID="1f773961-b526-4457-870c-ac299a3e3312" containerID="1a0aa82fdb2a0a8ce345b81c0f3dabeccc7dfaf4d0119db5450b96bd81c1f459" exitCode=0 Jan 29 17:16:35 crc kubenswrapper[4886]: I0129 17:16:35.026405 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bv9pm" event={"ID":"1f773961-b526-4457-870c-ac299a3e3312","Type":"ContainerDied","Data":"1a0aa82fdb2a0a8ce345b81c0f3dabeccc7dfaf4d0119db5450b96bd81c1f459"} Jan 29 17:16:35 crc kubenswrapper[4886]: I0129 17:16:35.668099 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:16:35 crc kubenswrapper[4886]: I0129 17:16:35.850340 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rq4p\" (UniqueName: \"kubernetes.io/projected/1f773961-b526-4457-870c-ac299a3e3312-kube-api-access-6rq4p\") pod \"1f773961-b526-4457-870c-ac299a3e3312\" (UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " Jan 29 17:16:35 crc kubenswrapper[4886]: I0129 17:16:35.850544 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-utilities\") pod \"1f773961-b526-4457-870c-ac299a3e3312\" (UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " Jan 29 17:16:35 crc kubenswrapper[4886]: I0129 17:16:35.850569 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-catalog-content\") pod \"1f773961-b526-4457-870c-ac299a3e3312\" (UID: \"1f773961-b526-4457-870c-ac299a3e3312\") " Jan 29 17:16:35 crc kubenswrapper[4886]: I0129 17:16:35.851450 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-utilities" (OuterVolumeSpecName: "utilities") pod "1f773961-b526-4457-870c-ac299a3e3312" (UID: "1f773961-b526-4457-870c-ac299a3e3312"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:16:35 crc kubenswrapper[4886]: I0129 17:16:35.857473 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f773961-b526-4457-870c-ac299a3e3312-kube-api-access-6rq4p" (OuterVolumeSpecName: "kube-api-access-6rq4p") pod "1f773961-b526-4457-870c-ac299a3e3312" (UID: "1f773961-b526-4457-870c-ac299a3e3312"). InnerVolumeSpecName "kube-api-access-6rq4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:16:35 crc kubenswrapper[4886]: I0129 17:16:35.953446 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:16:35 crc kubenswrapper[4886]: I0129 17:16:35.953477 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rq4p\" (UniqueName: \"kubernetes.io/projected/1f773961-b526-4457-870c-ac299a3e3312-kube-api-access-6rq4p\") on node \"crc\" DevicePath \"\"" Jan 29 17:16:35 crc kubenswrapper[4886]: I0129 17:16:35.963691 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f773961-b526-4457-870c-ac299a3e3312" (UID: "1f773961-b526-4457-870c-ac299a3e3312"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:16:36 crc kubenswrapper[4886]: I0129 17:16:36.040953 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bv9pm" event={"ID":"1f773961-b526-4457-870c-ac299a3e3312","Type":"ContainerDied","Data":"1cee306e378c6e1adda858d2bbd9e36da757769a33d53cb9a3ec25090fcac3dd"} Jan 29 17:16:36 crc kubenswrapper[4886]: I0129 17:16:36.041021 4886 scope.go:117] "RemoveContainer" containerID="1a0aa82fdb2a0a8ce345b81c0f3dabeccc7dfaf4d0119db5450b96bd81c1f459" Jan 29 17:16:36 crc kubenswrapper[4886]: I0129 17:16:36.042503 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bv9pm" Jan 29 17:16:36 crc kubenswrapper[4886]: I0129 17:16:36.056347 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f773961-b526-4457-870c-ac299a3e3312-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:16:36 crc kubenswrapper[4886]: I0129 17:16:36.090758 4886 scope.go:117] "RemoveContainer" containerID="b9e457fec0b46000ce1469c5ea146165937abd84d20dd0308ec4a5fc11ab5a73" Jan 29 17:16:36 crc kubenswrapper[4886]: I0129 17:16:36.097390 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bv9pm"] Jan 29 17:16:36 crc kubenswrapper[4886]: I0129 17:16:36.108440 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bv9pm"] Jan 29 17:16:36 crc kubenswrapper[4886]: I0129 17:16:36.122575 4886 scope.go:117] "RemoveContainer" containerID="398572a57dafba1fd44cb5ba23bcfc932f80aa853a274d63caeb2dea379597de" Jan 29 17:16:36 crc kubenswrapper[4886]: I0129 17:16:36.639718 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f773961-b526-4457-870c-ac299a3e3312" path="/var/lib/kubelet/pods/1f773961-b526-4457-870c-ac299a3e3312/volumes" Jan 29 17:16:48 crc kubenswrapper[4886]: I0129 17:16:48.629292 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:16:48 crc kubenswrapper[4886]: E0129 17:16:48.630155 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:16:52 crc kubenswrapper[4886]: I0129 17:16:52.819674 4886 scope.go:117] "RemoveContainer" containerID="c6fd592bb372f4bd56073a5709a8ef40ff848343cbd26b66d1e162d12eab6737" Jan 29 17:16:52 crc kubenswrapper[4886]: I0129 17:16:52.859501 4886 scope.go:117] "RemoveContainer" containerID="5279babaff011b0a7c0724784680ba960a9fce4465f977efe275f3b290d89fab" Jan 29 17:16:52 crc kubenswrapper[4886]: I0129 17:16:52.908817 4886 scope.go:117] "RemoveContainer" containerID="297512a17905e8884ba2dee2e1bd0e97f5fbde7e67ab2e041189401e3a8b1069" Jan 29 17:16:52 crc kubenswrapper[4886]: I0129 17:16:52.930582 4886 scope.go:117] "RemoveContainer" containerID="76e9fd9551f88713599d793f819bec47fc38185510d47fbd152e0939943ac037" Jan 29 17:16:52 crc kubenswrapper[4886]: I0129 17:16:52.994269 4886 scope.go:117] "RemoveContainer" containerID="1b2a63dcfed7450a36197cbdc154c29e365ef6be50e63a79bd321d9e35afd21f" Jan 29 17:16:53 crc 
kubenswrapper[4886]: I0129 17:16:53.032677 4886 scope.go:117] "RemoveContainer" containerID="c0779e333572b6cd2f4e3dc26dcb63d1cb95b806d59884314b143132c6990518" Jan 29 17:16:53 crc kubenswrapper[4886]: I0129 17:16:53.095959 4886 scope.go:117] "RemoveContainer" containerID="05a52ecdbf485c6c724d9a992c69aca83958ea1704df0dac8409ddf6fbc7b4d1" Jan 29 17:16:53 crc kubenswrapper[4886]: I0129 17:16:53.147720 4886 scope.go:117] "RemoveContainer" containerID="e61c63ed7fdb0d740a758c779dfae1d17126672ffa65adff6cc5cd29f6bcc51c" Jan 29 17:17:03 crc kubenswrapper[4886]: I0129 17:17:03.616764 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:17:03 crc kubenswrapper[4886]: E0129 17:17:03.617316 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:17:15 crc kubenswrapper[4886]: I0129 17:17:15.056220 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-p924n"] Jan 29 17:17:15 crc kubenswrapper[4886]: I0129 17:17:15.071053 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-p924n"] Jan 29 17:17:16 crc kubenswrapper[4886]: I0129 17:17:16.615438 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:17:16 crc kubenswrapper[4886]: E0129 17:17:16.616781 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:17:16 crc kubenswrapper[4886]: I0129 17:17:16.646490 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68cdc6ed-ce63-43af-8502-b36cc0ae788a" path="/var/lib/kubelet/pods/68cdc6ed-ce63-43af-8502-b36cc0ae788a/volumes" Jan 29 17:17:18 crc kubenswrapper[4886]: I0129 17:17:18.057496 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8m2mm"] Jan 29 17:17:18 crc kubenswrapper[4886]: I0129 17:17:18.074083 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8m2mm"] Jan 29 17:17:18 crc kubenswrapper[4886]: I0129 17:17:18.629613 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8923ac96-087a-425b-a8b4-c09aa4be3d78" path="/var/lib/kubelet/pods/8923ac96-087a-425b-a8b4-c09aa4be3d78/volumes" Jan 29 17:17:30 crc kubenswrapper[4886]: I0129 17:17:30.617062 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:17:31 crc kubenswrapper[4886]: I0129 17:17:31.724381 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"bd2f023886beead4933eaa92185559b0b9421864121dccb5c51a6c3ddd9cce35"} Jan 29 17:17:34 crc 
kubenswrapper[4886]: I0129 17:17:34.055483 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-q2dxw"] Jan 29 17:17:34 crc kubenswrapper[4886]: I0129 17:17:34.068597 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-q2dxw"] Jan 29 17:17:34 crc kubenswrapper[4886]: I0129 17:17:34.631855 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffb099fb-7bdb-4969-b3cb-6fc4ef498afd" path="/var/lib/kubelet/pods/ffb099fb-7bdb-4969-b3cb-6fc4ef498afd/volumes" Jan 29 17:17:38 crc kubenswrapper[4886]: I0129 17:17:38.058600 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-6nmwn"] Jan 29 17:17:38 crc kubenswrapper[4886]: I0129 17:17:38.070496 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-6nmwn"] Jan 29 17:17:38 crc kubenswrapper[4886]: I0129 17:17:38.627187 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0058f32-ae80-4dde-9dce-095c62f45979" path="/var/lib/kubelet/pods/a0058f32-ae80-4dde-9dce-095c62f45979/volumes" Jan 29 17:17:44 crc kubenswrapper[4886]: I0129 17:17:44.064787 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-j5gfz"] Jan 29 17:17:44 crc kubenswrapper[4886]: I0129 17:17:44.076522 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-j5gfz"] Jan 29 17:17:44 crc kubenswrapper[4886]: I0129 17:17:44.630418 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04dae116-ceca-4588-9cba-1266bfa92caf" path="/var/lib/kubelet/pods/04dae116-ceca-4588-9cba-1266bfa92caf/volumes" Jan 29 17:17:53 crc kubenswrapper[4886]: I0129 17:17:53.032997 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-qglhp"] Jan 29 17:17:53 crc kubenswrapper[4886]: I0129 17:17:53.041601 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-qglhp"] Jan 29 17:17:53 crc kubenswrapper[4886]: I0129 17:17:53.476040 4886 scope.go:117] "RemoveContainer" containerID="ab83d2d0c36aaea48832e86668e20e1d6f6f876644014c27f52bee83b6960b7d" Jan 29 17:17:53 crc kubenswrapper[4886]: I0129 17:17:53.508729 4886 scope.go:117] "RemoveContainer" containerID="b56f617415d312996740dc4a8697ef643e749e77f4339179492aab6c12f2f0d4" Jan 29 17:17:53 crc kubenswrapper[4886]: I0129 17:17:53.569593 4886 scope.go:117] "RemoveContainer" containerID="6375ad3e949f813db64562de4e61fa2910abcb717d2e211c509e5dbcb6b07f3a" Jan 29 17:17:53 crc kubenswrapper[4886]: I0129 17:17:53.641159 4886 scope.go:117] "RemoveContainer" containerID="09a30c5dfcb3deacf09e3ccec1c515a8213db072a4cbe06ac44ba60b9a7d0159" Jan 29 17:17:53 crc kubenswrapper[4886]: I0129 17:17:53.691040 4886 scope.go:117] "RemoveContainer" containerID="462d0b69d42ff5bdae3194985f827b482bb0c2607dbc772e35d27e51d1171c94" Jan 29 17:17:54 crc kubenswrapper[4886]: I0129 17:17:54.627447 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43da0665-7e6a-4176-ae84-71128a89a243" path="/var/lib/kubelet/pods/43da0665-7e6a-4176-ae84-71128a89a243/volumes" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.187020 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bfhgd"] Jan 29 17:18:15 crc kubenswrapper[4886]: E0129 17:18:15.188138 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f773961-b526-4457-870c-ac299a3e3312" containerName="extract-content" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 
17:18:15.188152 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f773961-b526-4457-870c-ac299a3e3312" containerName="extract-content" Jan 29 17:18:15 crc kubenswrapper[4886]: E0129 17:18:15.188186 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f773961-b526-4457-870c-ac299a3e3312" containerName="registry-server" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.188194 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f773961-b526-4457-870c-ac299a3e3312" containerName="registry-server" Jan 29 17:18:15 crc kubenswrapper[4886]: E0129 17:18:15.188207 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f773961-b526-4457-870c-ac299a3e3312" containerName="extract-utilities" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.188215 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f773961-b526-4457-870c-ac299a3e3312" containerName="extract-utilities" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.188469 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f773961-b526-4457-870c-ac299a3e3312" containerName="registry-server" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.190198 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.198057 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bfhgd"] Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.338564 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddpvb\" (UniqueName: \"kubernetes.io/projected/3de4fb0c-479a-43eb-bf0e-910c8993247d-kube-api-access-ddpvb\") pod \"certified-operators-bfhgd\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.338730 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-utilities\") pod \"certified-operators-bfhgd\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.338777 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-catalog-content\") pod \"certified-operators-bfhgd\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.440710 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-utilities\") pod \"certified-operators-bfhgd\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.440793 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-catalog-content\") pod \"certified-operators-bfhgd\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 
17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.440925 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddpvb\" (UniqueName: \"kubernetes.io/projected/3de4fb0c-479a-43eb-bf0e-910c8993247d-kube-api-access-ddpvb\") pod \"certified-operators-bfhgd\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.441398 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-utilities\") pod \"certified-operators-bfhgd\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.441645 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-catalog-content\") pod \"certified-operators-bfhgd\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.465748 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddpvb\" (UniqueName: \"kubernetes.io/projected/3de4fb0c-479a-43eb-bf0e-910c8993247d-kube-api-access-ddpvb\") pod \"certified-operators-bfhgd\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:18:15 crc kubenswrapper[4886]: I0129 17:18:15.519742 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:18:16 crc kubenswrapper[4886]: I0129 17:18:16.082499 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bfhgd"] Jan 29 17:18:16 crc kubenswrapper[4886]: I0129 17:18:16.240243 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bfhgd" event={"ID":"3de4fb0c-479a-43eb-bf0e-910c8993247d","Type":"ContainerStarted","Data":"993d036c77151116f0104b5e52ac5de851cc76e020d62a74f76c3bbf77ef5ab3"} Jan 29 17:18:17 crc kubenswrapper[4886]: I0129 17:18:17.253784 4886 generic.go:334] "Generic (PLEG): container finished" podID="3de4fb0c-479a-43eb-bf0e-910c8993247d" containerID="7b334ee63888db455be0d61b260b626dbcfd228221eee73d28ea7fa18d022523" exitCode=0 Jan 29 17:18:17 crc kubenswrapper[4886]: I0129 17:18:17.253863 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bfhgd" event={"ID":"3de4fb0c-479a-43eb-bf0e-910c8993247d","Type":"ContainerDied","Data":"7b334ee63888db455be0d61b260b626dbcfd228221eee73d28ea7fa18d022523"} Jan 29 17:18:17 crc kubenswrapper[4886]: E0129 17:18:17.672594 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:18:17 crc kubenswrapper[4886]: E0129 17:18:17.673016 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ddpvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bfhgd_openshift-marketplace(3de4fb0c-479a-43eb-bf0e-910c8993247d): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:18:17 crc kubenswrapper[4886]: E0129 17:18:17.676468 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:18:18 crc kubenswrapper[4886]: E0129 17:18:18.266964 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:18:24 crc kubenswrapper[4886]: I0129 17:18:24.045494 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-n9fr6"] Jan 29 17:18:24 crc kubenswrapper[4886]: I0129 17:18:24.056730 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-n9fr6"] Jan 29 17:18:24 crc kubenswrapper[4886]: I0129 17:18:24.633356 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6c4698-f001-402f-91e3-1e80bc7bf443" path="/var/lib/kubelet/pods/ea6c4698-f001-402f-91e3-1e80bc7bf443/volumes" Jan 29 17:18:25 crc kubenswrapper[4886]: I0129 17:18:25.031711 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-6jmdx"] Jan 29 17:18:25 crc kubenswrapper[4886]: I0129 17:18:25.043054 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4e9f-account-create-update-sdhth"] Jan 29 17:18:25 crc kubenswrapper[4886]: I0129 17:18:25.055649 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-db-create-6jmdx"] Jan 29 17:18:25 crc kubenswrapper[4886]: I0129 17:18:25.066402 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4e9f-account-create-update-sdhth"] Jan 29 17:18:26 crc kubenswrapper[4886]: I0129 17:18:26.040043 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f9c8-account-create-update-hcc42"] Jan 29 17:18:26 crc kubenswrapper[4886]: I0129 17:18:26.050475 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-vqrmb"] Jan 29 17:18:26 crc kubenswrapper[4886]: I0129 17:18:26.060316 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f9c8-account-create-update-hcc42"] Jan 29 17:18:26 crc kubenswrapper[4886]: I0129 17:18:26.069624 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-vqrmb"] Jan 29 17:18:26 crc kubenswrapper[4886]: I0129 17:18:26.631473 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0abefc39-4eb0-4600-8e11-b5d4af3c11b4" path="/var/lib/kubelet/pods/0abefc39-4eb0-4600-8e11-b5d4af3c11b4/volumes" Jan 29 17:18:26 crc kubenswrapper[4886]: I0129 17:18:26.632934 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8258df8a-fd9a-4546-8ea7-ce4b7f7180bb" path="/var/lib/kubelet/pods/8258df8a-fd9a-4546-8ea7-ce4b7f7180bb/volumes" Jan 29 17:18:26 crc kubenswrapper[4886]: I0129 17:18:26.634188 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0772ac7-3374-4607-a644-f4ac2e1c078a" path="/var/lib/kubelet/pods/d0772ac7-3374-4607-a644-f4ac2e1c078a/volumes" Jan 29 17:18:26 crc kubenswrapper[4886]: I0129 17:18:26.635479 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13e59b2-0b15-4b7f-b158-ea16ec2b5416" path="/var/lib/kubelet/pods/d13e59b2-0b15-4b7f-b158-ea16ec2b5416/volumes" Jan 29 17:18:27 crc kubenswrapper[4886]: I0129 17:18:27.039653 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cc0e-account-create-update-nxk7k"] Jan 29 17:18:27 crc kubenswrapper[4886]: I0129 17:18:27.053471 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cc0e-account-create-update-nxk7k"] Jan 29 17:18:28 crc kubenswrapper[4886]: I0129 17:18:28.628464 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af00928-6484-4071-b739-bc211ac220ef" path="/var/lib/kubelet/pods/6af00928-6484-4071-b739-bc211ac220ef/volumes" Jan 29 17:18:31 crc kubenswrapper[4886]: E0129 17:18:31.783277 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:18:31 crc kubenswrapper[4886]: E0129 17:18:31.783958 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ddpvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bfhgd_openshift-marketplace(3de4fb0c-479a-43eb-bf0e-910c8993247d): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:18:31 crc kubenswrapper[4886]: E0129 17:18:31.785198 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:18:43 crc kubenswrapper[4886]: E0129 17:18:43.643656 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:18:53 crc kubenswrapper[4886]: I0129 17:18:53.847113 4886 scope.go:117] "RemoveContainer" containerID="55979afc492dd3730aa23e20e090c57835e6091af47e18bbcd87fee5afa8dde9" Jan 29 17:18:53 crc kubenswrapper[4886]: I0129 17:18:53.874058 4886 scope.go:117] "RemoveContainer" containerID="c4ce1f7996acaa4140e3f499ede2bc0c80a3f2eb7c1df999e0b4f5903e1d75cf" Jan 29 17:18:53 crc kubenswrapper[4886]: I0129 17:18:53.983639 4886 scope.go:117] "RemoveContainer" containerID="b398660f408eb077ec37e46aac34f95a01068c141577a940f5d64dfc4dc0b027" Jan 29 17:18:54 crc kubenswrapper[4886]: I0129 17:18:54.026043 4886 scope.go:117] "RemoveContainer" containerID="e03fdcc391c686ad6f7c447bf2012b345cc1a12adaddfc3b0b7fbabe7adbed61" Jan 29 17:18:54 crc kubenswrapper[4886]: I0129 17:18:54.078997 4886 scope.go:117] "RemoveContainer" containerID="e75acdd55522e91761ce2d771dbc17900e4f53d297811cf9623f07bc70ba7052" Jan 29 17:18:54 crc kubenswrapper[4886]: I0129 17:18:54.126780 4886 scope.go:117] "RemoveContainer" containerID="8cff761f0cac80358e499809ffa647d36a191c7af1a493dc00f71f33ae4223f1" Jan 29 
17:18:54 crc kubenswrapper[4886]: I0129 17:18:54.172713 4886 scope.go:117] "RemoveContainer" containerID="92b4d1b2f475024d893ea29a83366ecc7f80ef2e9282821adbce174622472058" Jan 29 17:18:58 crc kubenswrapper[4886]: E0129 17:18:58.867103 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:18:58 crc kubenswrapper[4886]: E0129 17:18:58.867751 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ddpvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bfhgd_openshift-marketplace(3de4fb0c-479a-43eb-bf0e-910c8993247d): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:18:58 crc kubenswrapper[4886]: E0129 17:18:58.869754 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:19:01 crc kubenswrapper[4886]: I0129 17:19:01.057499 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c4q4z"] Jan 29 17:19:01 crc kubenswrapper[4886]: I0129 17:19:01.071611 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c4q4z"] Jan 29 17:19:02 crc kubenswrapper[4886]: I0129 17:19:02.629374 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c467eb7e-a553-4fc5-b366-607a30fe18dd" 
path="/var/lib/kubelet/pods/c467eb7e-a553-4fc5-b366-607a30fe18dd/volumes" Jan 29 17:19:11 crc kubenswrapper[4886]: E0129 17:19:11.617765 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:19:25 crc kubenswrapper[4886]: I0129 17:19:25.056006 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-60d5-account-create-update-w67hv"] Jan 29 17:19:25 crc kubenswrapper[4886]: I0129 17:19:25.081354 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-60d5-account-create-update-w67hv"] Jan 29 17:19:26 crc kubenswrapper[4886]: I0129 17:19:26.049206 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-6zh6p"] Jan 29 17:19:26 crc kubenswrapper[4886]: I0129 17:19:26.061470 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-6zh6p"] Jan 29 17:19:26 crc kubenswrapper[4886]: E0129 17:19:26.619458 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:19:26 crc kubenswrapper[4886]: I0129 17:19:26.689638 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="323a490d-33e2-4411-8a77-c578f409ba28" path="/var/lib/kubelet/pods/323a490d-33e2-4411-8a77-c578f409ba28/volumes" Jan 29 17:19:26 crc kubenswrapper[4886]: I0129 17:19:26.690907 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec6f2462-b78d-4619-9704-5cc67ae60974" path="/var/lib/kubelet/pods/ec6f2462-b78d-4619-9704-5cc67ae60974/volumes" Jan 29 17:19:28 crc kubenswrapper[4886]: I0129 17:19:28.030432 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-tqcf4"] Jan 29 17:19:28 crc kubenswrapper[4886]: I0129 17:19:28.046068 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-tqcf4"] Jan 29 17:19:28 crc kubenswrapper[4886]: I0129 17:19:28.633110 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cabf586-398a-45a9-80d6-2fd63d9e14e5" path="/var/lib/kubelet/pods/8cabf586-398a-45a9-80d6-2fd63d9e14e5/volumes" Jan 29 17:19:30 crc kubenswrapper[4886]: I0129 17:19:30.037520 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fznz7"] Jan 29 17:19:30 crc kubenswrapper[4886]: I0129 17:19:30.049680 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fznz7"] Jan 29 17:19:30 crc kubenswrapper[4886]: I0129 17:19:30.633286 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a88a08b7-d54a-4414-b7f6-b490949d6b70" path="/var/lib/kubelet/pods/a88a08b7-d54a-4414-b7f6-b490949d6b70/volumes" Jan 29 17:19:39 crc kubenswrapper[4886]: E0129 17:19:39.750751 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:19:39 crc kubenswrapper[4886]: E0129 17:19:39.751420 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ddpvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bfhgd_openshift-marketplace(3de4fb0c-479a-43eb-bf0e-910c8993247d): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:19:39 crc kubenswrapper[4886]: E0129 17:19:39.753043 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:19:53 crc kubenswrapper[4886]: E0129 17:19:53.618503 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:19:54 crc kubenswrapper[4886]: I0129 17:19:54.345308 4886 scope.go:117] "RemoveContainer" containerID="b0c7be4a8a6f220b0bc62ecd7ce7d07cb8b17e5644962c70a9a466af1717c6ce" Jan 29 17:19:54 crc kubenswrapper[4886]: I0129 17:19:54.376768 4886 scope.go:117] "RemoveContainer" containerID="94c431dc7f3dd6c3f091efc6b5f4191b950083388e1ef0390fd70fcd7a85128c" Jan 29 17:19:54 crc kubenswrapper[4886]: I0129 17:19:54.443480 4886 scope.go:117] "RemoveContainer" containerID="2e1c0eadae73024c2cb0f70a58a6f4f7d1a81518c1e179c7358b1ee70d254152" Jan 29 17:19:54 crc kubenswrapper[4886]: I0129 17:19:54.501727 4886 scope.go:117] "RemoveContainer" 
containerID="b316bbc4bed9ea6d21a1f48ac1daf91a604e958e8664a1c95a0d70b2476abcfa" Jan 29 17:19:54 crc kubenswrapper[4886]: I0129 17:19:54.583672 4886 scope.go:117] "RemoveContainer" containerID="d6960d602147a760f370e0aaeba322f8c53999b050075e5ef6c33ecafc0b7928" Jan 29 17:19:59 crc kubenswrapper[4886]: I0129 17:19:59.660808 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:19:59 crc kubenswrapper[4886]: I0129 17:19:59.661818 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:20:04 crc kubenswrapper[4886]: E0129 17:20:04.620272 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:20:12 crc kubenswrapper[4886]: I0129 17:20:12.048085 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ddfqz"] Jan 29 17:20:12 crc kubenswrapper[4886]: I0129 17:20:12.060117 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ddfqz"] Jan 29 17:20:12 crc kubenswrapper[4886]: I0129 17:20:12.629647 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1c51cd-f91d-406b-815c-00879a9d6401" path="/var/lib/kubelet/pods/7a1c51cd-f91d-406b-815c-00879a9d6401/volumes" Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.674473 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vlgkv"] Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.677196 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.691961 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vlgkv"] Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.748150 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6n2h\" (UniqueName: \"kubernetes.io/projected/75397189-e390-4b5d-bb9d-3017be63794e-kube-api-access-g6n2h\") pod \"community-operators-vlgkv\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.748480 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-catalog-content\") pod \"community-operators-vlgkv\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.749349 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-utilities\") pod \"community-operators-vlgkv\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.852941 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-catalog-content\") pod \"community-operators-vlgkv\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.853261 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-utilities\") pod \"community-operators-vlgkv\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.853381 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6n2h\" (UniqueName: \"kubernetes.io/projected/75397189-e390-4b5d-bb9d-3017be63794e-kube-api-access-g6n2h\") pod \"community-operators-vlgkv\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.853872 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-catalog-content\") pod \"community-operators-vlgkv\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.853985 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-utilities\") pod \"community-operators-vlgkv\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:20:14 crc kubenswrapper[4886]: I0129 17:20:14.874119 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g6n2h\" (UniqueName: \"kubernetes.io/projected/75397189-e390-4b5d-bb9d-3017be63794e-kube-api-access-g6n2h\") pod \"community-operators-vlgkv\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:20:15 crc kubenswrapper[4886]: I0129 17:20:15.004706 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:20:15 crc kubenswrapper[4886]: I0129 17:20:15.592745 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vlgkv"] Jan 29 17:20:15 crc kubenswrapper[4886]: E0129 17:20:15.625154 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:20:16 crc kubenswrapper[4886]: I0129 17:20:16.556879 4886 generic.go:334] "Generic (PLEG): container finished" podID="75397189-e390-4b5d-bb9d-3017be63794e" containerID="50e7d409e21eaec1e565da5ff686d38148bb5fcc53234f8118461f6f78ce385c" exitCode=0 Jan 29 17:20:16 crc kubenswrapper[4886]: I0129 17:20:16.556997 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlgkv" event={"ID":"75397189-e390-4b5d-bb9d-3017be63794e","Type":"ContainerDied","Data":"50e7d409e21eaec1e565da5ff686d38148bb5fcc53234f8118461f6f78ce385c"} Jan 29 17:20:16 crc kubenswrapper[4886]: I0129 17:20:16.557248 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlgkv" event={"ID":"75397189-e390-4b5d-bb9d-3017be63794e","Type":"ContainerStarted","Data":"3014185eacc0527fb4588d33782092cb9980b118f2b2053ba0af25fe3485682c"} Jan 29 17:20:16 crc kubenswrapper[4886]: E0129 17:20:16.695075 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 17:20:16 crc kubenswrapper[4886]: E0129 17:20:16.695509 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6n2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vlgkv_openshift-marketplace(75397189-e390-4b5d-bb9d-3017be63794e): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:20:16 crc kubenswrapper[4886]: E0129 17:20:16.696754 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-vlgkv" podUID="75397189-e390-4b5d-bb9d-3017be63794e" Jan 29 17:20:17 crc kubenswrapper[4886]: E0129 17:20:17.572408 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vlgkv" podUID="75397189-e390-4b5d-bb9d-3017be63794e" Jan 29 17:20:27 crc kubenswrapper[4886]: E0129 17:20:27.617832 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:20:29 crc kubenswrapper[4886]: I0129 17:20:29.661071 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:20:29 crc kubenswrapper[4886]: I0129 17:20:29.661594 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:20:30 crc kubenswrapper[4886]: E0129 17:20:30.768243 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 17:20:30 crc kubenswrapper[4886]: E0129 17:20:30.768701 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6n2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vlgkv_openshift-marketplace(75397189-e390-4b5d-bb9d-3017be63794e): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:20:30 crc kubenswrapper[4886]: E0129 17:20:30.769881 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-vlgkv" podUID="75397189-e390-4b5d-bb9d-3017be63794e" Jan 29 17:20:41 crc kubenswrapper[4886]: E0129 17:20:41.618127 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:20:45 crc kubenswrapper[4886]: E0129 17:20:45.618487 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vlgkv" podUID="75397189-e390-4b5d-bb9d-3017be63794e" Jan 29 17:20:52 crc kubenswrapper[4886]: E0129 17:20:52.618387 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" Jan 29 17:20:54 crc kubenswrapper[4886]: I0129 17:20:54.738729 4886 scope.go:117] "RemoveContainer" containerID="5be86521758fe7c03f20fd8b758e10774f421701b95693128fa47b2a2e5adc70" Jan 29 17:20:56 crc kubenswrapper[4886]: I0129 17:20:56.617300 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 17:20:56 crc kubenswrapper[4886]: E0129 17:20:56.746594 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 17:20:56 crc kubenswrapper[4886]: E0129 17:20:56.747705 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6n2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vlgkv_openshift-marketplace(75397189-e390-4b5d-bb9d-3017be63794e): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:20:56 crc kubenswrapper[4886]: E0129 17:20:56.749298 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: 
Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-vlgkv" podUID="75397189-e390-4b5d-bb9d-3017be63794e" Jan 29 17:20:59 crc kubenswrapper[4886]: I0129 17:20:59.660660 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:20:59 crc kubenswrapper[4886]: I0129 17:20:59.661197 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:20:59 crc kubenswrapper[4886]: I0129 17:20:59.661241 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 17:20:59 crc kubenswrapper[4886]: I0129 17:20:59.662162 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd2f023886beead4933eaa92185559b0b9421864121dccb5c51a6c3ddd9cce35"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 17:20:59 crc kubenswrapper[4886]: I0129 17:20:59.662223 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://bd2f023886beead4933eaa92185559b0b9421864121dccb5c51a6c3ddd9cce35" gracePeriod=600 Jan 29 17:21:00 crc kubenswrapper[4886]: I0129 17:21:00.053436 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="bd2f023886beead4933eaa92185559b0b9421864121dccb5c51a6c3ddd9cce35" exitCode=0 Jan 29 17:21:00 crc kubenswrapper[4886]: I0129 17:21:00.053512 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"bd2f023886beead4933eaa92185559b0b9421864121dccb5c51a6c3ddd9cce35"} Jan 29 17:21:00 crc kubenswrapper[4886]: I0129 17:21:00.053861 4886 scope.go:117] "RemoveContainer" containerID="37523dcabcb104a05e3a585e6aacd7a7633efd02b8c8e5f7dd95e23d0d43f05d" Jan 29 17:21:01 crc kubenswrapper[4886]: I0129 17:21:01.069186 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b"} Jan 29 17:21:06 crc kubenswrapper[4886]: I0129 17:21:06.123943 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bfhgd" event={"ID":"3de4fb0c-479a-43eb-bf0e-910c8993247d","Type":"ContainerStarted","Data":"23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17"} Jan 29 17:21:07 crc kubenswrapper[4886]: I0129 17:21:07.142734 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="3de4fb0c-479a-43eb-bf0e-910c8993247d" containerID="23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17" exitCode=0 Jan 29 17:21:07 crc kubenswrapper[4886]: I0129 17:21:07.142846 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bfhgd" event={"ID":"3de4fb0c-479a-43eb-bf0e-910c8993247d","Type":"ContainerDied","Data":"23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17"} Jan 29 17:21:08 crc kubenswrapper[4886]: I0129 17:21:08.156027 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bfhgd" event={"ID":"3de4fb0c-479a-43eb-bf0e-910c8993247d","Type":"ContainerStarted","Data":"acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514"} Jan 29 17:21:08 crc kubenswrapper[4886]: I0129 17:21:08.187977 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bfhgd" podStartSLOduration=2.903105783 podStartE2EDuration="2m53.187923943s" podCreationTimestamp="2026-01-29 17:18:15 +0000 UTC" firstStartedPulling="2026-01-29 17:18:17.256204503 +0000 UTC m=+3380.164923795" lastFinishedPulling="2026-01-29 17:21:07.541022683 +0000 UTC m=+3550.449741955" observedRunningTime="2026-01-29 17:21:08.180529292 +0000 UTC m=+3551.089248584" watchObservedRunningTime="2026-01-29 17:21:08.187923943 +0000 UTC m=+3551.096643225" Jan 29 17:21:09 crc kubenswrapper[4886]: E0129 17:21:09.616476 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vlgkv" podUID="75397189-e390-4b5d-bb9d-3017be63794e" Jan 29 17:21:15 crc kubenswrapper[4886]: I0129 17:21:15.520373 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:21:15 crc kubenswrapper[4886]: I0129 17:21:15.521053 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:21:15 crc kubenswrapper[4886]: I0129 17:21:15.565744 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:21:16 crc kubenswrapper[4886]: I0129 17:21:16.278178 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:21:16 crc kubenswrapper[4886]: I0129 17:21:16.341296 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bfhgd"] Jan 29 17:21:18 crc kubenswrapper[4886]: I0129 17:21:18.246779 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bfhgd" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" containerName="registry-server" containerID="cri-o://acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514" gracePeriod=2 Jan 29 17:21:18 crc kubenswrapper[4886]: I0129 17:21:18.806479 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:21:18 crc kubenswrapper[4886]: I0129 17:21:18.902299 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-utilities\") pod \"3de4fb0c-479a-43eb-bf0e-910c8993247d\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " Jan 29 17:21:18 crc kubenswrapper[4886]: I0129 17:21:18.902517 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-catalog-content\") pod \"3de4fb0c-479a-43eb-bf0e-910c8993247d\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " Jan 29 17:21:18 crc kubenswrapper[4886]: I0129 17:21:18.902629 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddpvb\" (UniqueName: \"kubernetes.io/projected/3de4fb0c-479a-43eb-bf0e-910c8993247d-kube-api-access-ddpvb\") pod \"3de4fb0c-479a-43eb-bf0e-910c8993247d\" (UID: \"3de4fb0c-479a-43eb-bf0e-910c8993247d\") " Jan 29 17:21:18 crc kubenswrapper[4886]: I0129 17:21:18.903253 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-utilities" (OuterVolumeSpecName: "utilities") pod "3de4fb0c-479a-43eb-bf0e-910c8993247d" (UID: "3de4fb0c-479a-43eb-bf0e-910c8993247d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:21:18 crc kubenswrapper[4886]: I0129 17:21:18.903607 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:21:18 crc kubenswrapper[4886]: I0129 17:21:18.909090 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de4fb0c-479a-43eb-bf0e-910c8993247d-kube-api-access-ddpvb" (OuterVolumeSpecName: "kube-api-access-ddpvb") pod "3de4fb0c-479a-43eb-bf0e-910c8993247d" (UID: "3de4fb0c-479a-43eb-bf0e-910c8993247d"). InnerVolumeSpecName "kube-api-access-ddpvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:21:18 crc kubenswrapper[4886]: I0129 17:21:18.965570 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3de4fb0c-479a-43eb-bf0e-910c8993247d" (UID: "3de4fb0c-479a-43eb-bf0e-910c8993247d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.006263 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de4fb0c-479a-43eb-bf0e-910c8993247d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.006551 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddpvb\" (UniqueName: \"kubernetes.io/projected/3de4fb0c-479a-43eb-bf0e-910c8993247d-kube-api-access-ddpvb\") on node \"crc\" DevicePath \"\"" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.266883 4886 generic.go:334] "Generic (PLEG): container finished" podID="3de4fb0c-479a-43eb-bf0e-910c8993247d" containerID="acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514" exitCode=0 Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.266944 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bfhgd" event={"ID":"3de4fb0c-479a-43eb-bf0e-910c8993247d","Type":"ContainerDied","Data":"acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514"} Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.266993 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bfhgd" event={"ID":"3de4fb0c-479a-43eb-bf0e-910c8993247d","Type":"ContainerDied","Data":"993d036c77151116f0104b5e52ac5de851cc76e020d62a74f76c3bbf77ef5ab3"} Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.267022 4886 scope.go:117] "RemoveContainer" containerID="acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.268387 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bfhgd" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.300742 4886 scope.go:117] "RemoveContainer" containerID="23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.336342 4886 scope.go:117] "RemoveContainer" containerID="7b334ee63888db455be0d61b260b626dbcfd228221eee73d28ea7fa18d022523" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.340728 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bfhgd"] Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.367221 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bfhgd"] Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.393824 4886 scope.go:117] "RemoveContainer" containerID="acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514" Jan 29 17:21:19 crc kubenswrapper[4886]: E0129 17:21:19.394415 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514\": container with ID starting with acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514 not found: ID does not exist" containerID="acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.394455 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514"} err="failed to get container status \"acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514\": rpc error: code = NotFound desc = could not find container \"acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514\": container with ID starting with acb662959a37da402dea77491374e233bfdd0a622e4f08294b5de2e093497514 not found: ID does not exist" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.394484 4886 scope.go:117] "RemoveContainer" containerID="23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17" Jan 29 17:21:19 crc kubenswrapper[4886]: E0129 17:21:19.394773 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17\": container with ID starting with 23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17 not found: ID does not exist" containerID="23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.394805 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17"} err="failed to get container status \"23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17\": rpc error: code = NotFound desc = could not find container \"23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17\": container with ID starting with 23137d1dfd07eb4543914832d6fbec9b81563f5df4b7e520d96f009aed078d17 not found: ID does not exist" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.394824 4886 scope.go:117] "RemoveContainer" containerID="7b334ee63888db455be0d61b260b626dbcfd228221eee73d28ea7fa18d022523" Jan 29 17:21:19 crc kubenswrapper[4886]: E0129 17:21:19.395112 4886 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7b334ee63888db455be0d61b260b626dbcfd228221eee73d28ea7fa18d022523\": container with ID starting with 7b334ee63888db455be0d61b260b626dbcfd228221eee73d28ea7fa18d022523 not found: ID does not exist" containerID="7b334ee63888db455be0d61b260b626dbcfd228221eee73d28ea7fa18d022523" Jan 29 17:21:19 crc kubenswrapper[4886]: I0129 17:21:19.395140 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b334ee63888db455be0d61b260b626dbcfd228221eee73d28ea7fa18d022523"} err="failed to get container status \"7b334ee63888db455be0d61b260b626dbcfd228221eee73d28ea7fa18d022523\": rpc error: code = NotFound desc = could not find container \"7b334ee63888db455be0d61b260b626dbcfd228221eee73d28ea7fa18d022523\": container with ID starting with 7b334ee63888db455be0d61b260b626dbcfd228221eee73d28ea7fa18d022523 not found: ID does not exist" Jan 29 17:21:20 crc kubenswrapper[4886]: I0129 17:21:20.631189 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" path="/var/lib/kubelet/pods/3de4fb0c-479a-43eb-bf0e-910c8993247d/volumes" Jan 29 17:21:23 crc kubenswrapper[4886]: E0129 17:21:23.619197 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vlgkv" podUID="75397189-e390-4b5d-bb9d-3017be63794e" Jan 29 17:21:36 crc kubenswrapper[4886]: E0129 17:21:36.619659 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vlgkv" podUID="75397189-e390-4b5d-bb9d-3017be63794e" Jan 29 17:21:49 crc kubenswrapper[4886]: I0129 17:21:49.635764 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlgkv" event={"ID":"75397189-e390-4b5d-bb9d-3017be63794e","Type":"ContainerStarted","Data":"77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3"} Jan 29 17:21:51 crc kubenswrapper[4886]: I0129 17:21:51.679975 4886 generic.go:334] "Generic (PLEG): container finished" podID="75397189-e390-4b5d-bb9d-3017be63794e" containerID="77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3" exitCode=0 Jan 29 17:21:51 crc kubenswrapper[4886]: I0129 17:21:51.680031 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlgkv" event={"ID":"75397189-e390-4b5d-bb9d-3017be63794e","Type":"ContainerDied","Data":"77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3"} Jan 29 17:21:52 crc kubenswrapper[4886]: I0129 17:21:52.697735 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlgkv" event={"ID":"75397189-e390-4b5d-bb9d-3017be63794e","Type":"ContainerStarted","Data":"c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a"} Jan 29 17:21:52 crc kubenswrapper[4886]: I0129 17:21:52.723310 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vlgkv" podStartSLOduration=3.141950495 podStartE2EDuration="1m38.723288463s" podCreationTimestamp="2026-01-29 17:20:14 +0000 UTC" firstStartedPulling="2026-01-29 
17:20:16.560126085 +0000 UTC m=+3499.468845387" lastFinishedPulling="2026-01-29 17:21:52.141464083 +0000 UTC m=+3595.050183355" observedRunningTime="2026-01-29 17:21:52.71477829 +0000 UTC m=+3595.623497592" watchObservedRunningTime="2026-01-29 17:21:52.723288463 +0000 UTC m=+3595.632007735" Jan 29 17:21:55 crc kubenswrapper[4886]: I0129 17:21:55.005509 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:21:55 crc kubenswrapper[4886]: I0129 17:21:55.005788 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:21:55 crc kubenswrapper[4886]: I0129 17:21:55.054819 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:22:05 crc kubenswrapper[4886]: I0129 17:22:05.099433 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:22:05 crc kubenswrapper[4886]: I0129 17:22:05.151041 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vlgkv"] Jan 29 17:22:05 crc kubenswrapper[4886]: I0129 17:22:05.841097 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vlgkv" podUID="75397189-e390-4b5d-bb9d-3017be63794e" containerName="registry-server" containerID="cri-o://c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a" gracePeriod=2 Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.460550 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.594133 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6n2h\" (UniqueName: \"kubernetes.io/projected/75397189-e390-4b5d-bb9d-3017be63794e-kube-api-access-g6n2h\") pod \"75397189-e390-4b5d-bb9d-3017be63794e\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.594241 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-utilities\") pod \"75397189-e390-4b5d-bb9d-3017be63794e\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.594552 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-catalog-content\") pod \"75397189-e390-4b5d-bb9d-3017be63794e\" (UID: \"75397189-e390-4b5d-bb9d-3017be63794e\") " Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.596002 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-utilities" (OuterVolumeSpecName: "utilities") pod "75397189-e390-4b5d-bb9d-3017be63794e" (UID: "75397189-e390-4b5d-bb9d-3017be63794e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.603703 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75397189-e390-4b5d-bb9d-3017be63794e-kube-api-access-g6n2h" (OuterVolumeSpecName: "kube-api-access-g6n2h") pod "75397189-e390-4b5d-bb9d-3017be63794e" (UID: "75397189-e390-4b5d-bb9d-3017be63794e"). InnerVolumeSpecName "kube-api-access-g6n2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.652028 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75397189-e390-4b5d-bb9d-3017be63794e" (UID: "75397189-e390-4b5d-bb9d-3017be63794e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.698536 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.698580 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6n2h\" (UniqueName: \"kubernetes.io/projected/75397189-e390-4b5d-bb9d-3017be63794e-kube-api-access-g6n2h\") on node \"crc\" DevicePath \"\"" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.698593 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75397189-e390-4b5d-bb9d-3017be63794e-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.881214 4886 generic.go:334] "Generic (PLEG): container finished" podID="75397189-e390-4b5d-bb9d-3017be63794e" containerID="c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a" exitCode=0 Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.881275 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlgkv" event={"ID":"75397189-e390-4b5d-bb9d-3017be63794e","Type":"ContainerDied","Data":"c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a"} Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.881309 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlgkv" event={"ID":"75397189-e390-4b5d-bb9d-3017be63794e","Type":"ContainerDied","Data":"3014185eacc0527fb4588d33782092cb9980b118f2b2053ba0af25fe3485682c"} Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.881349 4886 scope.go:117] "RemoveContainer" containerID="c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.881745 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vlgkv" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.908658 4886 scope.go:117] "RemoveContainer" containerID="77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.927840 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vlgkv"] Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.936599 4886 scope.go:117] "RemoveContainer" containerID="50e7d409e21eaec1e565da5ff686d38148bb5fcc53234f8118461f6f78ce385c" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.939904 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vlgkv"] Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.995485 4886 scope.go:117] "RemoveContainer" containerID="c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a" Jan 29 17:22:06 crc kubenswrapper[4886]: E0129 17:22:06.996256 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a\": container with ID starting with c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a not found: ID does not exist" containerID="c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.996303 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a"} err="failed to get container status \"c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a\": rpc error: code = NotFound desc = could not find container \"c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a\": container with ID starting with c5d7748afbf0374cd560d960e67108fce3b5d85a4dc5d8649cbc28214002142a not found: ID does not exist" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.996348 4886 scope.go:117] "RemoveContainer" containerID="77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3" Jan 29 17:22:06 crc kubenswrapper[4886]: E0129 17:22:06.997024 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3\": container with ID starting with 77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3 not found: ID does not exist" containerID="77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.997215 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3"} err="failed to get container status \"77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3\": rpc error: code = NotFound desc = could not find container \"77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3\": container with ID starting with 77e9f26b3d74ceb7b80a6b0256c506671555f76c7435cc01ce88b73443d5caf3 not found: ID does not exist" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.997413 4886 scope.go:117] "RemoveContainer" containerID="50e7d409e21eaec1e565da5ff686d38148bb5fcc53234f8118461f6f78ce385c" Jan 29 17:22:06 crc kubenswrapper[4886]: E0129 17:22:06.998530 4886 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"50e7d409e21eaec1e565da5ff686d38148bb5fcc53234f8118461f6f78ce385c\": container with ID starting with 50e7d409e21eaec1e565da5ff686d38148bb5fcc53234f8118461f6f78ce385c not found: ID does not exist" containerID="50e7d409e21eaec1e565da5ff686d38148bb5fcc53234f8118461f6f78ce385c" Jan 29 17:22:06 crc kubenswrapper[4886]: I0129 17:22:06.998577 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e7d409e21eaec1e565da5ff686d38148bb5fcc53234f8118461f6f78ce385c"} err="failed to get container status \"50e7d409e21eaec1e565da5ff686d38148bb5fcc53234f8118461f6f78ce385c\": rpc error: code = NotFound desc = could not find container \"50e7d409e21eaec1e565da5ff686d38148bb5fcc53234f8118461f6f78ce385c\": container with ID starting with 50e7d409e21eaec1e565da5ff686d38148bb5fcc53234f8118461f6f78ce385c not found: ID does not exist" Jan 29 17:22:08 crc kubenswrapper[4886]: I0129 17:22:08.627240 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75397189-e390-4b5d-bb9d-3017be63794e" path="/var/lib/kubelet/pods/75397189-e390-4b5d-bb9d-3017be63794e/volumes" Jan 29 17:23:17 crc kubenswrapper[4886]: I0129 17:23:17.545686 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-f458794ff-v7p92" podUID="79c81ef9-65c7-4372-9a47-8ed93521eadf" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 29 17:23:29 crc kubenswrapper[4886]: I0129 17:23:29.661268 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:23:29 crc kubenswrapper[4886]: I0129 17:23:29.662070 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:23:59 crc kubenswrapper[4886]: I0129 17:23:59.660654 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:23:59 crc kubenswrapper[4886]: I0129 17:23:59.661247 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:24:29 crc kubenswrapper[4886]: I0129 17:24:29.661084 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:24:29 crc kubenswrapper[4886]: I0129 17:24:29.661823 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" 
podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:24:29 crc kubenswrapper[4886]: I0129 17:24:29.661903 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 17:24:29 crc kubenswrapper[4886]: I0129 17:24:29.662902 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 17:24:29 crc kubenswrapper[4886]: I0129 17:24:29.662987 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" gracePeriod=600 Jan 29 17:24:29 crc kubenswrapper[4886]: E0129 17:24:29.819763 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:24:30 crc kubenswrapper[4886]: I0129 17:24:30.496221 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" exitCode=0 Jan 29 17:24:30 crc kubenswrapper[4886]: I0129 17:24:30.496285 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b"} Jan 29 17:24:30 crc kubenswrapper[4886]: I0129 17:24:30.496398 4886 scope.go:117] "RemoveContainer" containerID="bd2f023886beead4933eaa92185559b0b9421864121dccb5c51a6c3ddd9cce35" Jan 29 17:24:30 crc kubenswrapper[4886]: I0129 17:24:30.497421 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:24:30 crc kubenswrapper[4886]: E0129 17:24:30.497820 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:24:41 crc kubenswrapper[4886]: I0129 17:24:41.615939 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:24:41 crc kubenswrapper[4886]: E0129 17:24:41.617394 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:24:52 crc kubenswrapper[4886]: I0129 17:24:52.615530 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:24:52 crc kubenswrapper[4886]: E0129 17:24:52.617065 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:25:07 crc kubenswrapper[4886]: I0129 17:25:07.615618 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:25:07 crc kubenswrapper[4886]: E0129 17:25:07.616959 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:25:18 crc kubenswrapper[4886]: I0129 17:25:18.631250 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:25:18 crc kubenswrapper[4886]: E0129 17:25:18.632501 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:25:30 crc kubenswrapper[4886]: I0129 17:25:30.616055 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:25:30 crc kubenswrapper[4886]: E0129 17:25:30.617512 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:25:42 crc kubenswrapper[4886]: I0129 17:25:42.616122 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:25:42 crc kubenswrapper[4886]: E0129 17:25:42.617548 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:25:55 crc kubenswrapper[4886]: I0129 17:25:55.615791 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:25:55 crc kubenswrapper[4886]: E0129 17:25:55.616828 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:26:10 crc kubenswrapper[4886]: I0129 17:26:10.616448 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:26:10 crc kubenswrapper[4886]: E0129 17:26:10.617273 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:26:24 crc kubenswrapper[4886]: I0129 17:26:24.616002 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:26:24 crc kubenswrapper[4886]: E0129 17:26:24.616782 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:26:35 crc kubenswrapper[4886]: I0129 17:26:35.616358 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:26:35 crc kubenswrapper[4886]: E0129 17:26:35.618670 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:26:49 crc kubenswrapper[4886]: I0129 17:26:49.615377 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:26:49 crc kubenswrapper[4886]: E0129 17:26:49.616044 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:27:02 crc kubenswrapper[4886]: I0129 17:27:02.614738 4886 
scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:27:02 crc kubenswrapper[4886]: E0129 17:27:02.615558 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:27:15 crc kubenswrapper[4886]: I0129 17:27:15.615472 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:27:15 crc kubenswrapper[4886]: E0129 17:27:15.618681 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:27:26 crc kubenswrapper[4886]: I0129 17:27:26.618162 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:27:26 crc kubenswrapper[4886]: E0129 17:27:26.620005 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:27:37 crc kubenswrapper[4886]: I0129 17:27:37.618624 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:27:37 crc kubenswrapper[4886]: E0129 17:27:37.620231 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:27:52 crc kubenswrapper[4886]: I0129 17:27:52.615899 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:27:52 crc kubenswrapper[4886]: E0129 17:27:52.616747 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.829697 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cvqft"] Jan 29 17:27:54 crc kubenswrapper[4886]: E0129 17:27:54.830494 4886 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" containerName="extract-content" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.830506 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" containerName="extract-content" Jan 29 17:27:54 crc kubenswrapper[4886]: E0129 17:27:54.830531 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" containerName="extract-utilities" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.830537 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" containerName="extract-utilities" Jan 29 17:27:54 crc kubenswrapper[4886]: E0129 17:27:54.830548 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75397189-e390-4b5d-bb9d-3017be63794e" containerName="extract-utilities" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.830554 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="75397189-e390-4b5d-bb9d-3017be63794e" containerName="extract-utilities" Jan 29 17:27:54 crc kubenswrapper[4886]: E0129 17:27:54.830566 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75397189-e390-4b5d-bb9d-3017be63794e" containerName="registry-server" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.830572 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="75397189-e390-4b5d-bb9d-3017be63794e" containerName="registry-server" Jan 29 17:27:54 crc kubenswrapper[4886]: E0129 17:27:54.830596 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75397189-e390-4b5d-bb9d-3017be63794e" containerName="extract-content" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.830602 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="75397189-e390-4b5d-bb9d-3017be63794e" containerName="extract-content" Jan 29 17:27:54 crc kubenswrapper[4886]: E0129 17:27:54.830610 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" containerName="registry-server" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.830615 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" containerName="registry-server" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.830804 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de4fb0c-479a-43eb-bf0e-910c8993247d" containerName="registry-server" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.830823 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="75397189-e390-4b5d-bb9d-3017be63794e" containerName="registry-server" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.832317 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.854928 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cvqft"] Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.941072 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgnp6\" (UniqueName: \"kubernetes.io/projected/bd300ccf-3376-4861-bcae-bf7e7310ab20-kube-api-access-wgnp6\") pod \"redhat-operators-cvqft\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.941160 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-catalog-content\") pod \"redhat-operators-cvqft\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:27:54 crc kubenswrapper[4886]: I0129 17:27:54.941193 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-utilities\") pod \"redhat-operators-cvqft\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:27:55 crc kubenswrapper[4886]: I0129 17:27:55.043414 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgnp6\" (UniqueName: \"kubernetes.io/projected/bd300ccf-3376-4861-bcae-bf7e7310ab20-kube-api-access-wgnp6\") pod \"redhat-operators-cvqft\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:27:55 crc kubenswrapper[4886]: I0129 17:27:55.043536 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-catalog-content\") pod \"redhat-operators-cvqft\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:27:55 crc kubenswrapper[4886]: I0129 17:27:55.043593 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-utilities\") pod \"redhat-operators-cvqft\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:27:55 crc kubenswrapper[4886]: I0129 17:27:55.044096 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-catalog-content\") pod \"redhat-operators-cvqft\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:27:55 crc kubenswrapper[4886]: I0129 17:27:55.044191 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-utilities\") pod \"redhat-operators-cvqft\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:27:55 crc kubenswrapper[4886]: I0129 17:27:55.076384 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wgnp6\" (UniqueName: \"kubernetes.io/projected/bd300ccf-3376-4861-bcae-bf7e7310ab20-kube-api-access-wgnp6\") pod \"redhat-operators-cvqft\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:27:55 crc kubenswrapper[4886]: I0129 17:27:55.153301 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:27:55 crc kubenswrapper[4886]: I0129 17:27:55.671789 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cvqft"] Jan 29 17:27:56 crc kubenswrapper[4886]: I0129 17:27:56.251090 4886 generic.go:334] "Generic (PLEG): container finished" podID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerID="2946b5a7224cce9e100a708a0973e21f7be5d0a36fb81ad34a298fee9b955dad" exitCode=0 Jan 29 17:27:56 crc kubenswrapper[4886]: I0129 17:27:56.251138 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvqft" event={"ID":"bd300ccf-3376-4861-bcae-bf7e7310ab20","Type":"ContainerDied","Data":"2946b5a7224cce9e100a708a0973e21f7be5d0a36fb81ad34a298fee9b955dad"} Jan 29 17:27:56 crc kubenswrapper[4886]: I0129 17:27:56.251184 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvqft" event={"ID":"bd300ccf-3376-4861-bcae-bf7e7310ab20","Type":"ContainerStarted","Data":"ac306b50644c0ba93e28d27ffe560102b7472fe91bb542b8c5a074fde1b9d833"} Jan 29 17:27:56 crc kubenswrapper[4886]: I0129 17:27:56.253441 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 17:27:56 crc kubenswrapper[4886]: E0129 17:27:56.378914 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 17:27:56 crc kubenswrapper[4886]: E0129 17:27:56.379622 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wgnp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cvqft_openshift-marketplace(bd300ccf-3376-4861-bcae-bf7e7310ab20): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:27:56 crc kubenswrapper[4886]: E0129 17:27:56.380942 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-cvqft" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" Jan 29 17:27:57 crc kubenswrapper[4886]: E0129 17:27:57.264474 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cvqft" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" Jan 29 17:28:05 crc kubenswrapper[4886]: I0129 17:28:05.615419 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:28:05 crc kubenswrapper[4886]: E0129 17:28:05.616118 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:28:12 crc kubenswrapper[4886]: I0129 17:28:12.440508 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvqft" event={"ID":"bd300ccf-3376-4861-bcae-bf7e7310ab20","Type":"ContainerStarted","Data":"256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f"} Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.070678 4886 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-k4wq2"] Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.073428 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.084166 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4wq2"] Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.120061 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-catalog-content\") pod \"redhat-marketplace-k4wq2\" (UID: \"05cce123-7c5e-4254-b4af-53d0a93b2087\") " pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.120157 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-utilities\") pod \"redhat-marketplace-k4wq2\" (UID: \"05cce123-7c5e-4254-b4af-53d0a93b2087\") " pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.120192 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96btq\" (UniqueName: \"kubernetes.io/projected/05cce123-7c5e-4254-b4af-53d0a93b2087-kube-api-access-96btq\") pod \"redhat-marketplace-k4wq2\" (UID: \"05cce123-7c5e-4254-b4af-53d0a93b2087\") " pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.222341 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-catalog-content\") pod \"redhat-marketplace-k4wq2\" (UID: \"05cce123-7c5e-4254-b4af-53d0a93b2087\") " pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.222452 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-utilities\") pod \"redhat-marketplace-k4wq2\" (UID: \"05cce123-7c5e-4254-b4af-53d0a93b2087\") " pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.222491 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96btq\" (UniqueName: \"kubernetes.io/projected/05cce123-7c5e-4254-b4af-53d0a93b2087-kube-api-access-96btq\") pod \"redhat-marketplace-k4wq2\" (UID: \"05cce123-7c5e-4254-b4af-53d0a93b2087\") " pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.223003 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-catalog-content\") pod \"redhat-marketplace-k4wq2\" (UID: \"05cce123-7c5e-4254-b4af-53d0a93b2087\") " pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.223027 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-utilities\") pod \"redhat-marketplace-k4wq2\" (UID: 
\"05cce123-7c5e-4254-b4af-53d0a93b2087\") " pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.245981 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96btq\" (UniqueName: \"kubernetes.io/projected/05cce123-7c5e-4254-b4af-53d0a93b2087-kube-api-access-96btq\") pod \"redhat-marketplace-k4wq2\" (UID: \"05cce123-7c5e-4254-b4af-53d0a93b2087\") " pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:13 crc kubenswrapper[4886]: I0129 17:28:13.430065 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:14 crc kubenswrapper[4886]: I0129 17:28:14.015274 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4wq2"] Jan 29 17:28:14 crc kubenswrapper[4886]: I0129 17:28:14.459652 4886 generic.go:334] "Generic (PLEG): container finished" podID="05cce123-7c5e-4254-b4af-53d0a93b2087" containerID="842f59d01bbe3e85d057d0fd9d33f7e9337664f17faeae195f7a44ef00d411bf" exitCode=0 Jan 29 17:28:14 crc kubenswrapper[4886]: I0129 17:28:14.459713 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4wq2" event={"ID":"05cce123-7c5e-4254-b4af-53d0a93b2087","Type":"ContainerDied","Data":"842f59d01bbe3e85d057d0fd9d33f7e9337664f17faeae195f7a44ef00d411bf"} Jan 29 17:28:14 crc kubenswrapper[4886]: I0129 17:28:14.459940 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4wq2" event={"ID":"05cce123-7c5e-4254-b4af-53d0a93b2087","Type":"ContainerStarted","Data":"61a0b584afbf6481ef8bc0dbb3bd55f9512c81522023b6b6c81f7237e81d868f"} Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.474043 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8ddvd"] Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.483955 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.492212 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8ddvd"] Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.594113 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-catalog-content\") pod \"certified-operators-8ddvd\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.594195 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5qz6\" (UniqueName: \"kubernetes.io/projected/35a75b14-10dc-482f-9b03-be71a8b0bfd4-kube-api-access-s5qz6\") pod \"certified-operators-8ddvd\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.594473 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-utilities\") pod \"certified-operators-8ddvd\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.696966 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-catalog-content\") pod \"certified-operators-8ddvd\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.697033 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5qz6\" (UniqueName: \"kubernetes.io/projected/35a75b14-10dc-482f-9b03-be71a8b0bfd4-kube-api-access-s5qz6\") pod \"certified-operators-8ddvd\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.697193 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-utilities\") pod \"certified-operators-8ddvd\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.697753 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-utilities\") pod \"certified-operators-8ddvd\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.698410 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-catalog-content\") pod \"certified-operators-8ddvd\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.730581 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s5qz6\" (UniqueName: \"kubernetes.io/projected/35a75b14-10dc-482f-9b03-be71a8b0bfd4-kube-api-access-s5qz6\") pod \"certified-operators-8ddvd\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:15 crc kubenswrapper[4886]: I0129 17:28:15.854344 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:16 crc kubenswrapper[4886]: I0129 17:28:16.423121 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8ddvd"] Jan 29 17:28:16 crc kubenswrapper[4886]: I0129 17:28:16.488319 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ddvd" event={"ID":"35a75b14-10dc-482f-9b03-be71a8b0bfd4","Type":"ContainerStarted","Data":"ae8c7924702c370ac40c8e4b953e0e47962503187d56a562a79b94f682fa85c7"} Jan 29 17:28:16 crc kubenswrapper[4886]: I0129 17:28:16.500658 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4wq2" event={"ID":"05cce123-7c5e-4254-b4af-53d0a93b2087","Type":"ContainerStarted","Data":"253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394"} Jan 29 17:28:17 crc kubenswrapper[4886]: I0129 17:28:17.513387 4886 generic.go:334] "Generic (PLEG): container finished" podID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" containerID="743f0e0c8bd0dfe8ae38c7f2d03a8981e74ea3dba06a6339a6bd917fe57aa8e9" exitCode=0 Jan 29 17:28:17 crc kubenswrapper[4886]: I0129 17:28:17.513464 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ddvd" event={"ID":"35a75b14-10dc-482f-9b03-be71a8b0bfd4","Type":"ContainerDied","Data":"743f0e0c8bd0dfe8ae38c7f2d03a8981e74ea3dba06a6339a6bd917fe57aa8e9"} Jan 29 17:28:17 crc kubenswrapper[4886]: I0129 17:28:17.615744 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:28:17 crc kubenswrapper[4886]: E0129 17:28:17.616209 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:28:18 crc kubenswrapper[4886]: I0129 17:28:18.540958 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ddvd" event={"ID":"35a75b14-10dc-482f-9b03-be71a8b0bfd4","Type":"ContainerStarted","Data":"642ddcf7d24f5ba4de7f4cfe5021d1c82bdda14e5ce39e790d42f342b92ed808"} Jan 29 17:28:18 crc kubenswrapper[4886]: I0129 17:28:18.545205 4886 generic.go:334] "Generic (PLEG): container finished" podID="05cce123-7c5e-4254-b4af-53d0a93b2087" containerID="253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394" exitCode=0 Jan 29 17:28:18 crc kubenswrapper[4886]: I0129 17:28:18.545415 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4wq2" event={"ID":"05cce123-7c5e-4254-b4af-53d0a93b2087","Type":"ContainerDied","Data":"253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394"} Jan 29 17:28:19 crc kubenswrapper[4886]: I0129 17:28:19.562219 4886 generic.go:334] 
"Generic (PLEG): container finished" podID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerID="256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f" exitCode=0 Jan 29 17:28:19 crc kubenswrapper[4886]: I0129 17:28:19.564214 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvqft" event={"ID":"bd300ccf-3376-4861-bcae-bf7e7310ab20","Type":"ContainerDied","Data":"256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f"} Jan 29 17:28:20 crc kubenswrapper[4886]: I0129 17:28:20.577137 4886 generic.go:334] "Generic (PLEG): container finished" podID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" containerID="642ddcf7d24f5ba4de7f4cfe5021d1c82bdda14e5ce39e790d42f342b92ed808" exitCode=0 Jan 29 17:28:20 crc kubenswrapper[4886]: I0129 17:28:20.577535 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ddvd" event={"ID":"35a75b14-10dc-482f-9b03-be71a8b0bfd4","Type":"ContainerDied","Data":"642ddcf7d24f5ba4de7f4cfe5021d1c82bdda14e5ce39e790d42f342b92ed808"} Jan 29 17:28:20 crc kubenswrapper[4886]: I0129 17:28:20.585965 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4wq2" event={"ID":"05cce123-7c5e-4254-b4af-53d0a93b2087","Type":"ContainerStarted","Data":"3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f"} Jan 29 17:28:20 crc kubenswrapper[4886]: I0129 17:28:20.592476 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvqft" event={"ID":"bd300ccf-3376-4861-bcae-bf7e7310ab20","Type":"ContainerStarted","Data":"4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3"} Jan 29 17:28:20 crc kubenswrapper[4886]: I0129 17:28:20.640005 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cvqft" podStartSLOduration=2.780366371 podStartE2EDuration="26.639982947s" podCreationTimestamp="2026-01-29 17:27:54 +0000 UTC" firstStartedPulling="2026-01-29 17:27:56.253144087 +0000 UTC m=+3959.161863359" lastFinishedPulling="2026-01-29 17:28:20.112760623 +0000 UTC m=+3983.021479935" observedRunningTime="2026-01-29 17:28:20.630438914 +0000 UTC m=+3983.539158226" watchObservedRunningTime="2026-01-29 17:28:20.639982947 +0000 UTC m=+3983.548702219" Jan 29 17:28:20 crc kubenswrapper[4886]: I0129 17:28:20.672093 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k4wq2" podStartSLOduration=3.103068192 podStartE2EDuration="7.672071616s" podCreationTimestamp="2026-01-29 17:28:13 +0000 UTC" firstStartedPulling="2026-01-29 17:28:14.461984219 +0000 UTC m=+3977.370703491" lastFinishedPulling="2026-01-29 17:28:19.030987633 +0000 UTC m=+3981.939706915" observedRunningTime="2026-01-29 17:28:20.651219309 +0000 UTC m=+3983.559938591" watchObservedRunningTime="2026-01-29 17:28:20.672071616 +0000 UTC m=+3983.580790888" Jan 29 17:28:22 crc kubenswrapper[4886]: I0129 17:28:22.630608 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ddvd" event={"ID":"35a75b14-10dc-482f-9b03-be71a8b0bfd4","Type":"ContainerStarted","Data":"eb2c0e2ba022ed5bbe2b78ba5d991d3803db9fdfe00af6d0c6e96716e4b2a750"} Jan 29 17:28:22 crc kubenswrapper[4886]: I0129 17:28:22.668393 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8ddvd" podStartSLOduration=3.716718622 
podStartE2EDuration="7.668367332s" podCreationTimestamp="2026-01-29 17:28:15 +0000 UTC" firstStartedPulling="2026-01-29 17:28:17.515946556 +0000 UTC m=+3980.424665838" lastFinishedPulling="2026-01-29 17:28:21.467595276 +0000 UTC m=+3984.376314548" observedRunningTime="2026-01-29 17:28:22.655058691 +0000 UTC m=+3985.563777963" watchObservedRunningTime="2026-01-29 17:28:22.668367332 +0000 UTC m=+3985.577086624" Jan 29 17:28:23 crc kubenswrapper[4886]: I0129 17:28:23.430605 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:23 crc kubenswrapper[4886]: I0129 17:28:23.431820 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:23 crc kubenswrapper[4886]: I0129 17:28:23.503412 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:25 crc kubenswrapper[4886]: I0129 17:28:25.154491 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:28:25 crc kubenswrapper[4886]: I0129 17:28:25.154986 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:28:25 crc kubenswrapper[4886]: I0129 17:28:25.855287 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:25 crc kubenswrapper[4886]: I0129 17:28:25.855361 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:25 crc kubenswrapper[4886]: I0129 17:28:25.900002 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:26 crc kubenswrapper[4886]: I0129 17:28:26.210575 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cvqft" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerName="registry-server" probeResult="failure" output=< Jan 29 17:28:26 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 17:28:26 crc kubenswrapper[4886]: > Jan 29 17:28:32 crc kubenswrapper[4886]: I0129 17:28:32.616228 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:28:32 crc kubenswrapper[4886]: E0129 17:28:32.617305 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:28:33 crc kubenswrapper[4886]: I0129 17:28:33.729234 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:33 crc kubenswrapper[4886]: I0129 17:28:33.790001 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4wq2"] Jan 29 17:28:34 crc kubenswrapper[4886]: I0129 17:28:34.744496 4886 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-k4wq2" podUID="05cce123-7c5e-4254-b4af-53d0a93b2087" containerName="registry-server" containerID="cri-o://3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f" gracePeriod=2 Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.583080 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.614228 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96btq\" (UniqueName: \"kubernetes.io/projected/05cce123-7c5e-4254-b4af-53d0a93b2087-kube-api-access-96btq\") pod \"05cce123-7c5e-4254-b4af-53d0a93b2087\" (UID: \"05cce123-7c5e-4254-b4af-53d0a93b2087\") " Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.614366 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-catalog-content\") pod \"05cce123-7c5e-4254-b4af-53d0a93b2087\" (UID: \"05cce123-7c5e-4254-b4af-53d0a93b2087\") " Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.614511 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-utilities\") pod \"05cce123-7c5e-4254-b4af-53d0a93b2087\" (UID: \"05cce123-7c5e-4254-b4af-53d0a93b2087\") " Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.615418 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-utilities" (OuterVolumeSpecName: "utilities") pod "05cce123-7c5e-4254-b4af-53d0a93b2087" (UID: "05cce123-7c5e-4254-b4af-53d0a93b2087"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.616533 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.641267 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05cce123-7c5e-4254-b4af-53d0a93b2087" (UID: "05cce123-7c5e-4254-b4af-53d0a93b2087"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.651419 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cce123-7c5e-4254-b4af-53d0a93b2087-kube-api-access-96btq" (OuterVolumeSpecName: "kube-api-access-96btq") pod "05cce123-7c5e-4254-b4af-53d0a93b2087" (UID: "05cce123-7c5e-4254-b4af-53d0a93b2087"). InnerVolumeSpecName "kube-api-access-96btq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.719251 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96btq\" (UniqueName: \"kubernetes.io/projected/05cce123-7c5e-4254-b4af-53d0a93b2087-kube-api-access-96btq\") on node \"crc\" DevicePath \"\"" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.719340 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05cce123-7c5e-4254-b4af-53d0a93b2087-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.755486 4886 generic.go:334] "Generic (PLEG): container finished" podID="05cce123-7c5e-4254-b4af-53d0a93b2087" containerID="3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f" exitCode=0 Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.755533 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4wq2" event={"ID":"05cce123-7c5e-4254-b4af-53d0a93b2087","Type":"ContainerDied","Data":"3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f"} Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.755537 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4wq2" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.755559 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4wq2" event={"ID":"05cce123-7c5e-4254-b4af-53d0a93b2087","Type":"ContainerDied","Data":"61a0b584afbf6481ef8bc0dbb3bd55f9512c81522023b6b6c81f7237e81d868f"} Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.755579 4886 scope.go:117] "RemoveContainer" containerID="3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.779847 4886 scope.go:117] "RemoveContainer" containerID="253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.803545 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4wq2"] Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.814862 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4wq2"] Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.821413 4886 scope.go:117] "RemoveContainer" containerID="842f59d01bbe3e85d057d0fd9d33f7e9337664f17faeae195f7a44ef00d411bf" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.852541 4886 scope.go:117] "RemoveContainer" containerID="3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f" Jan 29 17:28:35 crc kubenswrapper[4886]: E0129 17:28:35.853241 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f\": container with ID starting with 3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f not found: ID does not exist" containerID="3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.853277 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f"} err="failed to get container status 
\"3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f\": rpc error: code = NotFound desc = could not find container \"3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f\": container with ID starting with 3b75f4763fead7a66bbf159571598e4b767cc99f693fb214dbf9a681b5f9707f not found: ID does not exist" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.853301 4886 scope.go:117] "RemoveContainer" containerID="253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394" Jan 29 17:28:35 crc kubenswrapper[4886]: E0129 17:28:35.853695 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394\": container with ID starting with 253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394 not found: ID does not exist" containerID="253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.853726 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394"} err="failed to get container status \"253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394\": rpc error: code = NotFound desc = could not find container \"253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394\": container with ID starting with 253cc685146e10232d5ab9f70d1ede857a6248476c2faa279c39a0a3b167d394 not found: ID does not exist" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.853748 4886 scope.go:117] "RemoveContainer" containerID="842f59d01bbe3e85d057d0fd9d33f7e9337664f17faeae195f7a44ef00d411bf" Jan 29 17:28:35 crc kubenswrapper[4886]: E0129 17:28:35.854053 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"842f59d01bbe3e85d057d0fd9d33f7e9337664f17faeae195f7a44ef00d411bf\": container with ID starting with 842f59d01bbe3e85d057d0fd9d33f7e9337664f17faeae195f7a44ef00d411bf not found: ID does not exist" containerID="842f59d01bbe3e85d057d0fd9d33f7e9337664f17faeae195f7a44ef00d411bf" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.854081 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"842f59d01bbe3e85d057d0fd9d33f7e9337664f17faeae195f7a44ef00d411bf"} err="failed to get container status \"842f59d01bbe3e85d057d0fd9d33f7e9337664f17faeae195f7a44ef00d411bf\": rpc error: code = NotFound desc = could not find container \"842f59d01bbe3e85d057d0fd9d33f7e9337664f17faeae195f7a44ef00d411bf\": container with ID starting with 842f59d01bbe3e85d057d0fd9d33f7e9337664f17faeae195f7a44ef00d411bf not found: ID does not exist" Jan 29 17:28:35 crc kubenswrapper[4886]: I0129 17:28:35.921565 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:36 crc kubenswrapper[4886]: I0129 17:28:36.207059 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cvqft" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerName="registry-server" probeResult="failure" output=< Jan 29 17:28:36 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 17:28:36 crc kubenswrapper[4886]: > Jan 29 17:28:36 crc kubenswrapper[4886]: I0129 17:28:36.630441 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="05cce123-7c5e-4254-b4af-53d0a93b2087" path="/var/lib/kubelet/pods/05cce123-7c5e-4254-b4af-53d0a93b2087/volumes" Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.176805 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8ddvd"] Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.177058 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8ddvd" podUID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" containerName="registry-server" containerID="cri-o://eb2c0e2ba022ed5bbe2b78ba5d991d3803db9fdfe00af6d0c6e96716e4b2a750" gracePeriod=2 Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.793553 4886 generic.go:334] "Generic (PLEG): container finished" podID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" containerID="eb2c0e2ba022ed5bbe2b78ba5d991d3803db9fdfe00af6d0c6e96716e4b2a750" exitCode=0 Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.793779 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ddvd" event={"ID":"35a75b14-10dc-482f-9b03-be71a8b0bfd4","Type":"ContainerDied","Data":"eb2c0e2ba022ed5bbe2b78ba5d991d3803db9fdfe00af6d0c6e96716e4b2a750"} Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.794158 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ddvd" event={"ID":"35a75b14-10dc-482f-9b03-be71a8b0bfd4","Type":"ContainerDied","Data":"ae8c7924702c370ac40c8e4b953e0e47962503187d56a562a79b94f682fa85c7"} Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.794180 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae8c7924702c370ac40c8e4b953e0e47962503187d56a562a79b94f682fa85c7" Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.835490 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.900179 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-utilities\") pod \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.900410 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5qz6\" (UniqueName: \"kubernetes.io/projected/35a75b14-10dc-482f-9b03-be71a8b0bfd4-kube-api-access-s5qz6\") pod \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.900586 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-catalog-content\") pod \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\" (UID: \"35a75b14-10dc-482f-9b03-be71a8b0bfd4\") " Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.901134 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-utilities" (OuterVolumeSpecName: "utilities") pod "35a75b14-10dc-482f-9b03-be71a8b0bfd4" (UID: "35a75b14-10dc-482f-9b03-be71a8b0bfd4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.901801 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.909036 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a75b14-10dc-482f-9b03-be71a8b0bfd4-kube-api-access-s5qz6" (OuterVolumeSpecName: "kube-api-access-s5qz6") pod "35a75b14-10dc-482f-9b03-be71a8b0bfd4" (UID: "35a75b14-10dc-482f-9b03-be71a8b0bfd4"). InnerVolumeSpecName "kube-api-access-s5qz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:28:38 crc kubenswrapper[4886]: I0129 17:28:38.966495 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35a75b14-10dc-482f-9b03-be71a8b0bfd4" (UID: "35a75b14-10dc-482f-9b03-be71a8b0bfd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:28:39 crc kubenswrapper[4886]: I0129 17:28:39.003889 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5qz6\" (UniqueName: \"kubernetes.io/projected/35a75b14-10dc-482f-9b03-be71a8b0bfd4-kube-api-access-s5qz6\") on node \"crc\" DevicePath \"\"" Jan 29 17:28:39 crc kubenswrapper[4886]: I0129 17:28:39.003937 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a75b14-10dc-482f-9b03-be71a8b0bfd4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:28:39 crc kubenswrapper[4886]: I0129 17:28:39.802993 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8ddvd" Jan 29 17:28:39 crc kubenswrapper[4886]: I0129 17:28:39.842539 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8ddvd"] Jan 29 17:28:39 crc kubenswrapper[4886]: I0129 17:28:39.854416 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8ddvd"] Jan 29 17:28:40 crc kubenswrapper[4886]: E0129 17:28:40.001068 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35a75b14_10dc_482f_9b03_be71a8b0bfd4.slice/crio-ae8c7924702c370ac40c8e4b953e0e47962503187d56a562a79b94f682fa85c7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35a75b14_10dc_482f_9b03_be71a8b0bfd4.slice\": RecentStats: unable to find data in memory cache]" Jan 29 17:28:40 crc kubenswrapper[4886]: I0129 17:28:40.628134 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" path="/var/lib/kubelet/pods/35a75b14-10dc-482f-9b03-be71a8b0bfd4/volumes" Jan 29 17:28:45 crc kubenswrapper[4886]: I0129 17:28:45.235128 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:28:45 crc kubenswrapper[4886]: I0129 17:28:45.288566 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:28:45 crc kubenswrapper[4886]: I0129 17:28:45.474218 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cvqft"] Jan 29 17:28:46 crc kubenswrapper[4886]: I0129 17:28:46.885807 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cvqft" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerName="registry-server" containerID="cri-o://4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3" gracePeriod=2 Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.550695 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.618372 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.618513 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-catalog-content\") pod \"bd300ccf-3376-4861-bcae-bf7e7310ab20\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.618730 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-utilities\") pod \"bd300ccf-3376-4861-bcae-bf7e7310ab20\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.618771 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgnp6\" (UniqueName: \"kubernetes.io/projected/bd300ccf-3376-4861-bcae-bf7e7310ab20-kube-api-access-wgnp6\") pod \"bd300ccf-3376-4861-bcae-bf7e7310ab20\" (UID: \"bd300ccf-3376-4861-bcae-bf7e7310ab20\") " Jan 29 17:28:47 crc kubenswrapper[4886]: E0129 17:28:47.618858 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.619592 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-utilities" (OuterVolumeSpecName: "utilities") pod "bd300ccf-3376-4861-bcae-bf7e7310ab20" (UID: "bd300ccf-3376-4861-bcae-bf7e7310ab20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.619747 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.627902 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd300ccf-3376-4861-bcae-bf7e7310ab20-kube-api-access-wgnp6" (OuterVolumeSpecName: "kube-api-access-wgnp6") pod "bd300ccf-3376-4861-bcae-bf7e7310ab20" (UID: "bd300ccf-3376-4861-bcae-bf7e7310ab20"). InnerVolumeSpecName "kube-api-access-wgnp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.722456 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgnp6\" (UniqueName: \"kubernetes.io/projected/bd300ccf-3376-4861-bcae-bf7e7310ab20-kube-api-access-wgnp6\") on node \"crc\" DevicePath \"\"" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.763028 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd300ccf-3376-4861-bcae-bf7e7310ab20" (UID: "bd300ccf-3376-4861-bcae-bf7e7310ab20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.824766 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd300ccf-3376-4861-bcae-bf7e7310ab20-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.897770 4886 generic.go:334] "Generic (PLEG): container finished" podID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerID="4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3" exitCode=0 Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.897810 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvqft" event={"ID":"bd300ccf-3376-4861-bcae-bf7e7310ab20","Type":"ContainerDied","Data":"4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3"} Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.897838 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvqft" event={"ID":"bd300ccf-3376-4861-bcae-bf7e7310ab20","Type":"ContainerDied","Data":"ac306b50644c0ba93e28d27ffe560102b7472fe91bb542b8c5a074fde1b9d833"} Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.897840 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cvqft" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.897854 4886 scope.go:117] "RemoveContainer" containerID="4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.919697 4886 scope.go:117] "RemoveContainer" containerID="256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.933260 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cvqft"] Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.947988 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cvqft"] Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.954461 4886 scope.go:117] "RemoveContainer" containerID="2946b5a7224cce9e100a708a0973e21f7be5d0a36fb81ad34a298fee9b955dad" Jan 29 17:28:47 crc kubenswrapper[4886]: I0129 17:28:47.999448 4886 scope.go:117] "RemoveContainer" containerID="4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3" Jan 29 17:28:47 crc kubenswrapper[4886]: E0129 17:28:47.999920 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3\": container with ID starting with 4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3 not found: ID does not exist" containerID="4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3" Jan 29 17:28:48 crc kubenswrapper[4886]: I0129 17:28:47.999966 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3"} err="failed to get container status \"4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3\": rpc error: code = NotFound desc = could not find container \"4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3\": container with ID starting with 4b5c3a0ee80d0412c7b91331bdd66750c33e8dcec79e419a2bfaa922a8aca1b3 not found: ID does not exist" Jan 29 17:28:48 crc kubenswrapper[4886]: I0129 17:28:47.999987 4886 scope.go:117] "RemoveContainer" containerID="256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f" Jan 29 17:28:48 crc kubenswrapper[4886]: E0129 17:28:48.000819 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f\": container with ID starting with 256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f not found: ID does not exist" containerID="256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f" Jan 29 17:28:48 crc kubenswrapper[4886]: I0129 17:28:48.001047 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f"} err="failed to get container status \"256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f\": rpc error: code = NotFound desc = could not find container \"256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f\": container with ID starting with 256d215b6bc8f6b4dd2d7a096efe29752d09fb6df76226c80283a55975a7751f not found: ID does not exist" Jan 29 17:28:48 crc kubenswrapper[4886]: I0129 17:28:48.001089 4886 scope.go:117] "RemoveContainer" 
containerID="2946b5a7224cce9e100a708a0973e21f7be5d0a36fb81ad34a298fee9b955dad" Jan 29 17:28:48 crc kubenswrapper[4886]: E0129 17:28:48.001679 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2946b5a7224cce9e100a708a0973e21f7be5d0a36fb81ad34a298fee9b955dad\": container with ID starting with 2946b5a7224cce9e100a708a0973e21f7be5d0a36fb81ad34a298fee9b955dad not found: ID does not exist" containerID="2946b5a7224cce9e100a708a0973e21f7be5d0a36fb81ad34a298fee9b955dad" Jan 29 17:28:48 crc kubenswrapper[4886]: I0129 17:28:48.001707 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2946b5a7224cce9e100a708a0973e21f7be5d0a36fb81ad34a298fee9b955dad"} err="failed to get container status \"2946b5a7224cce9e100a708a0973e21f7be5d0a36fb81ad34a298fee9b955dad\": rpc error: code = NotFound desc = could not find container \"2946b5a7224cce9e100a708a0973e21f7be5d0a36fb81ad34a298fee9b955dad\": container with ID starting with 2946b5a7224cce9e100a708a0973e21f7be5d0a36fb81ad34a298fee9b955dad not found: ID does not exist" Jan 29 17:28:48 crc kubenswrapper[4886]: I0129 17:28:48.630881 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" path="/var/lib/kubelet/pods/bd300ccf-3376-4861-bcae-bf7e7310ab20/volumes" Jan 29 17:29:00 crc kubenswrapper[4886]: I0129 17:29:00.616229 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:29:00 crc kubenswrapper[4886]: E0129 17:29:00.617040 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:29:15 crc kubenswrapper[4886]: I0129 17:29:15.615886 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:29:15 crc kubenswrapper[4886]: E0129 17:29:15.616731 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:29:27 crc kubenswrapper[4886]: I0129 17:29:27.615667 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:29:27 crc kubenswrapper[4886]: E0129 17:29:27.616608 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:29:39 crc kubenswrapper[4886]: I0129 17:29:39.614747 4886 scope.go:117] "RemoveContainer" 
containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:29:40 crc kubenswrapper[4886]: I0129 17:29:40.543355 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"35b339594f7204cb48b198eeee2a9559b017a0c55878601a4de933a78b8a5a91"} Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.198821 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55"] Jan 29 17:30:00 crc kubenswrapper[4886]: E0129 17:30:00.200002 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cce123-7c5e-4254-b4af-53d0a93b2087" containerName="extract-content" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200019 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cce123-7c5e-4254-b4af-53d0a93b2087" containerName="extract-content" Jan 29 17:30:00 crc kubenswrapper[4886]: E0129 17:30:00.200042 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cce123-7c5e-4254-b4af-53d0a93b2087" containerName="extract-utilities" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200050 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cce123-7c5e-4254-b4af-53d0a93b2087" containerName="extract-utilities" Jan 29 17:30:00 crc kubenswrapper[4886]: E0129 17:30:00.200064 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerName="extract-utilities" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200073 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerName="extract-utilities" Jan 29 17:30:00 crc kubenswrapper[4886]: E0129 17:30:00.200092 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerName="extract-content" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200099 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerName="extract-content" Jan 29 17:30:00 crc kubenswrapper[4886]: E0129 17:30:00.200118 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" containerName="extract-content" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200126 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" containerName="extract-content" Jan 29 17:30:00 crc kubenswrapper[4886]: E0129 17:30:00.200141 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" containerName="extract-utilities" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200148 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" containerName="extract-utilities" Jan 29 17:30:00 crc kubenswrapper[4886]: E0129 17:30:00.200166 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerName="registry-server" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200173 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerName="registry-server" Jan 29 17:30:00 crc kubenswrapper[4886]: E0129 17:30:00.200188 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="05cce123-7c5e-4254-b4af-53d0a93b2087" containerName="registry-server" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200195 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cce123-7c5e-4254-b4af-53d0a93b2087" containerName="registry-server" Jan 29 17:30:00 crc kubenswrapper[4886]: E0129 17:30:00.200213 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" containerName="registry-server" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200219 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" containerName="registry-server" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200486 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="05cce123-7c5e-4254-b4af-53d0a93b2087" containerName="registry-server" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200518 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd300ccf-3376-4861-bcae-bf7e7310ab20" containerName="registry-server" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.200550 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a75b14-10dc-482f-9b03-be71a8b0bfd4" containerName="registry-server" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.201534 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.211476 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.212702 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.214123 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55"] Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.301934 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-secret-volume\") pod \"collect-profiles-29495130-cdv55\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.302118 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq48m\" (UniqueName: \"kubernetes.io/projected/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-kube-api-access-nq48m\") pod \"collect-profiles-29495130-cdv55\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.302239 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-config-volume\") pod \"collect-profiles-29495130-cdv55\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.404944 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-secret-volume\") pod \"collect-profiles-29495130-cdv55\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.405072 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq48m\" (UniqueName: \"kubernetes.io/projected/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-kube-api-access-nq48m\") pod \"collect-profiles-29495130-cdv55\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.405227 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-config-volume\") pod \"collect-profiles-29495130-cdv55\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.406481 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-config-volume\") pod \"collect-profiles-29495130-cdv55\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.413764 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-secret-volume\") pod \"collect-profiles-29495130-cdv55\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.424872 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq48m\" (UniqueName: \"kubernetes.io/projected/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-kube-api-access-nq48m\") pod \"collect-profiles-29495130-cdv55\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:00 crc kubenswrapper[4886]: I0129 17:30:00.544389 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:01 crc kubenswrapper[4886]: I0129 17:30:01.093192 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55"] Jan 29 17:30:01 crc kubenswrapper[4886]: I0129 17:30:01.810403 4886 generic.go:334] "Generic (PLEG): container finished" podID="d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281" containerID="e970dea6a6e8251fa9ff24484a3f5ffaee4ce0d2fad251a5d786e848db7373be" exitCode=0 Jan 29 17:30:01 crc kubenswrapper[4886]: I0129 17:30:01.810449 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" event={"ID":"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281","Type":"ContainerDied","Data":"e970dea6a6e8251fa9ff24484a3f5ffaee4ce0d2fad251a5d786e848db7373be"} Jan 29 17:30:01 crc kubenswrapper[4886]: I0129 17:30:01.810790 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" event={"ID":"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281","Type":"ContainerStarted","Data":"9c259f168053361e77ed1c13731247842c1d5e0113f74a4d43cf2793fdb0de05"} Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.296980 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.425455 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-config-volume\") pod \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.425725 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq48m\" (UniqueName: \"kubernetes.io/projected/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-kube-api-access-nq48m\") pod \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.425856 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-secret-volume\") pod \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\" (UID: \"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281\") " Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.426923 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-config-volume" (OuterVolumeSpecName: "config-volume") pod "d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281" (UID: "d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.431995 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281" (UID: "d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.432608 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-kube-api-access-nq48m" (OuterVolumeSpecName: "kube-api-access-nq48m") pod "d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281" (UID: "d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281"). InnerVolumeSpecName "kube-api-access-nq48m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.528684 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.528732 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq48m\" (UniqueName: \"kubernetes.io/projected/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-kube-api-access-nq48m\") on node \"crc\" DevicePath \"\"" Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.528748 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.840183 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" event={"ID":"d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281","Type":"ContainerDied","Data":"9c259f168053361e77ed1c13731247842c1d5e0113f74a4d43cf2793fdb0de05"} Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.840228 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c259f168053361e77ed1c13731247842c1d5e0113f74a4d43cf2793fdb0de05" Jan 29 17:30:03 crc kubenswrapper[4886]: I0129 17:30:03.840300 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55" Jan 29 17:30:04 crc kubenswrapper[4886]: I0129 17:30:04.416346 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr"] Jan 29 17:30:04 crc kubenswrapper[4886]: I0129 17:30:04.429511 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495085-rzdqr"] Jan 29 17:30:04 crc kubenswrapper[4886]: I0129 17:30:04.640934 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a04871a-41ba-40fc-bfb0-ca8f308e9b01" path="/var/lib/kubelet/pods/0a04871a-41ba-40fc-bfb0-ca8f308e9b01/volumes" Jan 29 17:30:55 crc kubenswrapper[4886]: I0129 17:30:55.056388 4886 scope.go:117] "RemoveContainer" containerID="11c1455f9476b08d8f802dd75f2ecc6d25f6377ab593571ce7bee30aa00fa339" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.119412 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vs8cn"] Jan 29 17:31:56 crc kubenswrapper[4886]: E0129 17:31:56.120667 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281" containerName="collect-profiles" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.120682 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281" containerName="collect-profiles" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.120998 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281" containerName="collect-profiles" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.123259 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.137188 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vs8cn"] Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.234558 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-catalog-content\") pod \"community-operators-vs8cn\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.234747 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-utilities\") pod \"community-operators-vs8cn\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.234847 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flvwr\" (UniqueName: \"kubernetes.io/projected/b15eadb0-03e5-432e-a2e4-3366698223ab-kube-api-access-flvwr\") pod \"community-operators-vs8cn\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.337368 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-utilities\") pod \"community-operators-vs8cn\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.337488 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flvwr\" (UniqueName: \"kubernetes.io/projected/b15eadb0-03e5-432e-a2e4-3366698223ab-kube-api-access-flvwr\") pod \"community-operators-vs8cn\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.337593 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-catalog-content\") pod \"community-operators-vs8cn\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.337938 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-utilities\") pod \"community-operators-vs8cn\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.338027 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-catalog-content\") pod \"community-operators-vs8cn\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.371024 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-flvwr\" (UniqueName: \"kubernetes.io/projected/b15eadb0-03e5-432e-a2e4-3366698223ab-kube-api-access-flvwr\") pod \"community-operators-vs8cn\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:31:56 crc kubenswrapper[4886]: I0129 17:31:56.442814 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:31:57 crc kubenswrapper[4886]: I0129 17:31:57.053729 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vs8cn"] Jan 29 17:31:57 crc kubenswrapper[4886]: I0129 17:31:57.492605 4886 generic.go:334] "Generic (PLEG): container finished" podID="b15eadb0-03e5-432e-a2e4-3366698223ab" containerID="f4c918fc85d3407db2aee5b3bc2331867912c942c0d4a6509d35d2d9bbbd8082" exitCode=0 Jan 29 17:31:57 crc kubenswrapper[4886]: I0129 17:31:57.492665 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8cn" event={"ID":"b15eadb0-03e5-432e-a2e4-3366698223ab","Type":"ContainerDied","Data":"f4c918fc85d3407db2aee5b3bc2331867912c942c0d4a6509d35d2d9bbbd8082"} Jan 29 17:31:57 crc kubenswrapper[4886]: I0129 17:31:57.492717 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8cn" event={"ID":"b15eadb0-03e5-432e-a2e4-3366698223ab","Type":"ContainerStarted","Data":"b4bf1e94f7b9b806b205169e73b11e9ec0ab7cca4c63148827d709cccafc8c46"} Jan 29 17:31:58 crc kubenswrapper[4886]: I0129 17:31:58.505644 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8cn" event={"ID":"b15eadb0-03e5-432e-a2e4-3366698223ab","Type":"ContainerStarted","Data":"eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3"} Jan 29 17:31:59 crc kubenswrapper[4886]: I0129 17:31:59.660975 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:31:59 crc kubenswrapper[4886]: I0129 17:31:59.661406 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:32:00 crc kubenswrapper[4886]: I0129 17:32:00.534419 4886 generic.go:334] "Generic (PLEG): container finished" podID="b15eadb0-03e5-432e-a2e4-3366698223ab" containerID="eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3" exitCode=0 Jan 29 17:32:00 crc kubenswrapper[4886]: I0129 17:32:00.534618 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8cn" event={"ID":"b15eadb0-03e5-432e-a2e4-3366698223ab","Type":"ContainerDied","Data":"eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3"} Jan 29 17:32:01 crc kubenswrapper[4886]: I0129 17:32:01.548095 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8cn" event={"ID":"b15eadb0-03e5-432e-a2e4-3366698223ab","Type":"ContainerStarted","Data":"0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e"} Jan 29 
17:32:01 crc kubenswrapper[4886]: I0129 17:32:01.579943 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vs8cn" podStartSLOduration=1.953942748 podStartE2EDuration="5.579918453s" podCreationTimestamp="2026-01-29 17:31:56 +0000 UTC" firstStartedPulling="2026-01-29 17:31:57.494981886 +0000 UTC m=+4200.403701158" lastFinishedPulling="2026-01-29 17:32:01.120957561 +0000 UTC m=+4204.029676863" observedRunningTime="2026-01-29 17:32:01.568767286 +0000 UTC m=+4204.477486598" watchObservedRunningTime="2026-01-29 17:32:01.579918453 +0000 UTC m=+4204.488637735" Jan 29 17:32:06 crc kubenswrapper[4886]: I0129 17:32:06.443847 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:32:06 crc kubenswrapper[4886]: I0129 17:32:06.444454 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:32:06 crc kubenswrapper[4886]: I0129 17:32:06.518040 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:32:06 crc kubenswrapper[4886]: I0129 17:32:06.650154 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:32:06 crc kubenswrapper[4886]: I0129 17:32:06.755281 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vs8cn"] Jan 29 17:32:08 crc kubenswrapper[4886]: I0129 17:32:08.630436 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vs8cn" podUID="b15eadb0-03e5-432e-a2e4-3366698223ab" containerName="registry-server" containerID="cri-o://0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e" gracePeriod=2 Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.225870 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.289570 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-catalog-content\") pod \"b15eadb0-03e5-432e-a2e4-3366698223ab\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.289858 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-utilities\") pod \"b15eadb0-03e5-432e-a2e4-3366698223ab\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.290023 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flvwr\" (UniqueName: \"kubernetes.io/projected/b15eadb0-03e5-432e-a2e4-3366698223ab-kube-api-access-flvwr\") pod \"b15eadb0-03e5-432e-a2e4-3366698223ab\" (UID: \"b15eadb0-03e5-432e-a2e4-3366698223ab\") " Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.290663 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-utilities" (OuterVolumeSpecName: "utilities") pod "b15eadb0-03e5-432e-a2e4-3366698223ab" (UID: "b15eadb0-03e5-432e-a2e4-3366698223ab"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.291234 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.297115 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b15eadb0-03e5-432e-a2e4-3366698223ab-kube-api-access-flvwr" (OuterVolumeSpecName: "kube-api-access-flvwr") pod "b15eadb0-03e5-432e-a2e4-3366698223ab" (UID: "b15eadb0-03e5-432e-a2e4-3366698223ab"). InnerVolumeSpecName "kube-api-access-flvwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.346323 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b15eadb0-03e5-432e-a2e4-3366698223ab" (UID: "b15eadb0-03e5-432e-a2e4-3366698223ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.393506 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b15eadb0-03e5-432e-a2e4-3366698223ab-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.393541 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flvwr\" (UniqueName: \"kubernetes.io/projected/b15eadb0-03e5-432e-a2e4-3366698223ab-kube-api-access-flvwr\") on node \"crc\" DevicePath \"\"" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.641392 4886 generic.go:334] "Generic (PLEG): container finished" podID="b15eadb0-03e5-432e-a2e4-3366698223ab" containerID="0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e" exitCode=0 Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.641481 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vs8cn" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.641491 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8cn" event={"ID":"b15eadb0-03e5-432e-a2e4-3366698223ab","Type":"ContainerDied","Data":"0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e"} Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.641731 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8cn" event={"ID":"b15eadb0-03e5-432e-a2e4-3366698223ab","Type":"ContainerDied","Data":"b4bf1e94f7b9b806b205169e73b11e9ec0ab7cca4c63148827d709cccafc8c46"} Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.641752 4886 scope.go:117] "RemoveContainer" containerID="0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.688972 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vs8cn"] Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.698765 4886 scope.go:117] "RemoveContainer" containerID="eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.719514 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vs8cn"] Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.731579 4886 scope.go:117] "RemoveContainer" containerID="f4c918fc85d3407db2aee5b3bc2331867912c942c0d4a6509d35d2d9bbbd8082" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.854792 4886 scope.go:117] "RemoveContainer" containerID="0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e" Jan 29 17:32:09 crc kubenswrapper[4886]: E0129 17:32:09.855146 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e\": container with ID starting with 0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e not found: ID does not exist" containerID="0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.855176 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e"} err="failed to get container status \"0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e\": rpc error: code = NotFound desc = could not find container \"0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e\": container with ID starting with 0df23cbc18a986a7161c7c8329ec1af48c1a7572d4c3627fe49cd88cdfaa2f8e not found: ID does not exist" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.855197 4886 scope.go:117] "RemoveContainer" containerID="eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3" Jan 29 17:32:09 crc kubenswrapper[4886]: E0129 17:32:09.855490 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3\": container with ID starting with eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3 not found: ID does not exist" containerID="eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.855543 4886 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3"} err="failed to get container status \"eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3\": rpc error: code = NotFound desc = could not find container \"eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3\": container with ID starting with eb4c3a9407b847e7eb345a3881082eb392d458c7757612afe4499b64754972a3 not found: ID does not exist" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.855570 4886 scope.go:117] "RemoveContainer" containerID="f4c918fc85d3407db2aee5b3bc2331867912c942c0d4a6509d35d2d9bbbd8082" Jan 29 17:32:09 crc kubenswrapper[4886]: E0129 17:32:09.855818 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c918fc85d3407db2aee5b3bc2331867912c942c0d4a6509d35d2d9bbbd8082\": container with ID starting with f4c918fc85d3407db2aee5b3bc2331867912c942c0d4a6509d35d2d9bbbd8082 not found: ID does not exist" containerID="f4c918fc85d3407db2aee5b3bc2331867912c942c0d4a6509d35d2d9bbbd8082" Jan 29 17:32:09 crc kubenswrapper[4886]: I0129 17:32:09.855859 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c918fc85d3407db2aee5b3bc2331867912c942c0d4a6509d35d2d9bbbd8082"} err="failed to get container status \"f4c918fc85d3407db2aee5b3bc2331867912c942c0d4a6509d35d2d9bbbd8082\": rpc error: code = NotFound desc = could not find container \"f4c918fc85d3407db2aee5b3bc2331867912c942c0d4a6509d35d2d9bbbd8082\": container with ID starting with f4c918fc85d3407db2aee5b3bc2331867912c942c0d4a6509d35d2d9bbbd8082 not found: ID does not exist" Jan 29 17:32:10 crc kubenswrapper[4886]: I0129 17:32:10.633413 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b15eadb0-03e5-432e-a2e4-3366698223ab" path="/var/lib/kubelet/pods/b15eadb0-03e5-432e-a2e4-3366698223ab/volumes" Jan 29 17:32:29 crc kubenswrapper[4886]: I0129 17:32:29.661206 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:32:29 crc kubenswrapper[4886]: I0129 17:32:29.661802 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:32:59 crc kubenswrapper[4886]: I0129 17:32:59.661522 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:32:59 crc kubenswrapper[4886]: I0129 17:32:59.662091 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:32:59 crc kubenswrapper[4886]: I0129 
17:32:59.662143 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 17:32:59 crc kubenswrapper[4886]: I0129 17:32:59.662828 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"35b339594f7204cb48b198eeee2a9559b017a0c55878601a4de933a78b8a5a91"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 17:32:59 crc kubenswrapper[4886]: I0129 17:32:59.662887 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://35b339594f7204cb48b198eeee2a9559b017a0c55878601a4de933a78b8a5a91" gracePeriod=600 Jan 29 17:33:00 crc kubenswrapper[4886]: I0129 17:33:00.210149 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="35b339594f7204cb48b198eeee2a9559b017a0c55878601a4de933a78b8a5a91" exitCode=0 Jan 29 17:33:00 crc kubenswrapper[4886]: I0129 17:33:00.210239 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"35b339594f7204cb48b198eeee2a9559b017a0c55878601a4de933a78b8a5a91"} Jan 29 17:33:00 crc kubenswrapper[4886]: I0129 17:33:00.210533 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a"} Jan 29 17:33:00 crc kubenswrapper[4886]: I0129 17:33:00.210552 4886 scope.go:117] "RemoveContainer" containerID="55efff0568134497b8e6ea81a0b8b1f655f106780275cdcff4518a5bd8ee6d2b" Jan 29 17:34:55 crc kubenswrapper[4886]: I0129 17:34:55.464534 4886 scope.go:117] "RemoveContainer" containerID="642ddcf7d24f5ba4de7f4cfe5021d1c82bdda14e5ce39e790d42f342b92ed808" Jan 29 17:34:55 crc kubenswrapper[4886]: I0129 17:34:55.517230 4886 scope.go:117] "RemoveContainer" containerID="eb2c0e2ba022ed5bbe2b78ba5d991d3803db9fdfe00af6d0c6e96716e4b2a750" Jan 29 17:34:55 crc kubenswrapper[4886]: I0129 17:34:55.588080 4886 scope.go:117] "RemoveContainer" containerID="743f0e0c8bd0dfe8ae38c7f2d03a8981e74ea3dba06a6339a6bd917fe57aa8e9" Jan 29 17:35:29 crc kubenswrapper[4886]: I0129 17:35:29.661179 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:35:29 crc kubenswrapper[4886]: I0129 17:35:29.661856 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:35:59 crc kubenswrapper[4886]: I0129 17:35:59.661742 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:35:59 crc kubenswrapper[4886]: I0129 17:35:59.662509 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:36:29 crc kubenswrapper[4886]: I0129 17:36:29.660675 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:36:29 crc kubenswrapper[4886]: I0129 17:36:29.661090 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:36:29 crc kubenswrapper[4886]: I0129 17:36:29.661131 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 17:36:29 crc kubenswrapper[4886]: I0129 17:36:29.662001 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 17:36:29 crc kubenswrapper[4886]: I0129 17:36:29.662115 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" gracePeriod=600 Jan 29 17:36:29 crc kubenswrapper[4886]: E0129 17:36:29.784751 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:36:29 crc kubenswrapper[4886]: I0129 17:36:29.876867 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" exitCode=0 Jan 29 17:36:29 crc kubenswrapper[4886]: I0129 17:36:29.876917 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a"} Jan 29 17:36:29 crc kubenswrapper[4886]: I0129 17:36:29.876955 4886 scope.go:117] "RemoveContainer" 
containerID="35b339594f7204cb48b198eeee2a9559b017a0c55878601a4de933a78b8a5a91" Jan 29 17:36:29 crc kubenswrapper[4886]: I0129 17:36:29.879058 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:36:29 crc kubenswrapper[4886]: E0129 17:36:29.879890 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:36:42 crc kubenswrapper[4886]: I0129 17:36:42.615656 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:36:42 crc kubenswrapper[4886]: E0129 17:36:42.616491 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:36:54 crc kubenswrapper[4886]: I0129 17:36:54.623105 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:36:54 crc kubenswrapper[4886]: E0129 17:36:54.626531 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:37:07 crc kubenswrapper[4886]: I0129 17:37:07.617920 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:37:07 crc kubenswrapper[4886]: E0129 17:37:07.619179 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:37:18 crc kubenswrapper[4886]: I0129 17:37:18.627768 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:37:18 crc kubenswrapper[4886]: E0129 17:37:18.629084 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:37:31 crc kubenswrapper[4886]: I0129 17:37:31.616923 4886 scope.go:117] "RemoveContainer" 
containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:37:31 crc kubenswrapper[4886]: E0129 17:37:31.617664 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:37:42 crc kubenswrapper[4886]: I0129 17:37:42.621682 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:37:42 crc kubenswrapper[4886]: E0129 17:37:42.623232 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:37:56 crc kubenswrapper[4886]: I0129 17:37:56.621131 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:37:56 crc kubenswrapper[4886]: E0129 17:37:56.621924 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:38:07 crc kubenswrapper[4886]: I0129 17:38:07.616313 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:38:07 crc kubenswrapper[4886]: E0129 17:38:07.617551 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:38:19 crc kubenswrapper[4886]: I0129 17:38:19.617078 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:38:19 crc kubenswrapper[4886]: E0129 17:38:19.620919 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:38:31 crc kubenswrapper[4886]: I0129 17:38:31.615367 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:38:31 crc kubenswrapper[4886]: E0129 17:38:31.616195 4886 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:38:43 crc kubenswrapper[4886]: I0129 17:38:43.615857 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:38:43 crc kubenswrapper[4886]: E0129 17:38:43.616650 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.691027 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7bw7c"] Jan 29 17:38:51 crc kubenswrapper[4886]: E0129 17:38:51.692167 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15eadb0-03e5-432e-a2e4-3366698223ab" containerName="extract-utilities" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.692182 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15eadb0-03e5-432e-a2e4-3366698223ab" containerName="extract-utilities" Jan 29 17:38:51 crc kubenswrapper[4886]: E0129 17:38:51.692194 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15eadb0-03e5-432e-a2e4-3366698223ab" containerName="registry-server" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.692199 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15eadb0-03e5-432e-a2e4-3366698223ab" containerName="registry-server" Jan 29 17:38:51 crc kubenswrapper[4886]: E0129 17:38:51.692216 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15eadb0-03e5-432e-a2e4-3366698223ab" containerName="extract-content" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.692222 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15eadb0-03e5-432e-a2e4-3366698223ab" containerName="extract-content" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.692463 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15eadb0-03e5-432e-a2e4-3366698223ab" containerName="registry-server" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.695484 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.715171 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7bw7c"] Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.736991 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d2ph\" (UniqueName: \"kubernetes.io/projected/c566a66d-f66d-457d-80eb-a0cf5bf4e013-kube-api-access-9d2ph\") pod \"redhat-operators-7bw7c\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.737094 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-catalog-content\") pod \"redhat-operators-7bw7c\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.737174 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-utilities\") pod \"redhat-operators-7bw7c\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.839020 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d2ph\" (UniqueName: \"kubernetes.io/projected/c566a66d-f66d-457d-80eb-a0cf5bf4e013-kube-api-access-9d2ph\") pod \"redhat-operators-7bw7c\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.839095 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-catalog-content\") pod \"redhat-operators-7bw7c\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.839152 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-utilities\") pod \"redhat-operators-7bw7c\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.839630 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-utilities\") pod \"redhat-operators-7bw7c\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.839771 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-catalog-content\") pod \"redhat-operators-7bw7c\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:38:51 crc kubenswrapper[4886]: I0129 17:38:51.867014 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9d2ph\" (UniqueName: \"kubernetes.io/projected/c566a66d-f66d-457d-80eb-a0cf5bf4e013-kube-api-access-9d2ph\") pod \"redhat-operators-7bw7c\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:38:52 crc kubenswrapper[4886]: I0129 17:38:52.033305 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:38:52 crc kubenswrapper[4886]: I0129 17:38:52.568959 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7bw7c"] Jan 29 17:38:52 crc kubenswrapper[4886]: I0129 17:38:52.757145 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bw7c" event={"ID":"c566a66d-f66d-457d-80eb-a0cf5bf4e013","Type":"ContainerStarted","Data":"31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace"} Jan 29 17:38:52 crc kubenswrapper[4886]: I0129 17:38:52.757388 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bw7c" event={"ID":"c566a66d-f66d-457d-80eb-a0cf5bf4e013","Type":"ContainerStarted","Data":"cab69af52cd3a4f3f325f6b78803a593e82fd270c10956a862ec4c1b3df6eb47"} Jan 29 17:38:53 crc kubenswrapper[4886]: I0129 17:38:53.774123 4886 generic.go:334] "Generic (PLEG): container finished" podID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerID="31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace" exitCode=0 Jan 29 17:38:53 crc kubenswrapper[4886]: I0129 17:38:53.774219 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bw7c" event={"ID":"c566a66d-f66d-457d-80eb-a0cf5bf4e013","Type":"ContainerDied","Data":"31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace"} Jan 29 17:38:53 crc kubenswrapper[4886]: I0129 17:38:53.777934 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 17:38:53 crc kubenswrapper[4886]: E0129 17:38:53.917138 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 17:38:53 crc kubenswrapper[4886]: E0129 17:38:53.917370 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9d2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7bw7c_openshift-marketplace(c566a66d-f66d-457d-80eb-a0cf5bf4e013): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:38:53 crc kubenswrapper[4886]: E0129 17:38:53.918710 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:38:54 crc kubenswrapper[4886]: E0129 17:38:54.795709 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:38:58 crc kubenswrapper[4886]: I0129 17:38:58.624672 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:38:58 crc kubenswrapper[4886]: E0129 17:38:58.625358 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.180023 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sqs8b"] Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.183966 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.194577 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqs8b"] Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.216551 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-catalog-content\") pod \"redhat-marketplace-sqs8b\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.216706 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4d4z\" (UniqueName: \"kubernetes.io/projected/d8da04de-c293-46ce-aeae-b2081be3c077-kube-api-access-q4d4z\") pod \"redhat-marketplace-sqs8b\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.216742 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-utilities\") pod \"redhat-marketplace-sqs8b\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.320304 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-catalog-content\") pod \"redhat-marketplace-sqs8b\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.320464 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4d4z\" (UniqueName: \"kubernetes.io/projected/d8da04de-c293-46ce-aeae-b2081be3c077-kube-api-access-q4d4z\") pod \"redhat-marketplace-sqs8b\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.321113 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-utilities\") pod \"redhat-marketplace-sqs8b\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.321124 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-catalog-content\") pod \"redhat-marketplace-sqs8b\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.320495 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-utilities\") pod \"redhat-marketplace-sqs8b\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.346982 4886 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-q4d4z\" (UniqueName: \"kubernetes.io/projected/d8da04de-c293-46ce-aeae-b2081be3c077-kube-api-access-q4d4z\") pod \"redhat-marketplace-sqs8b\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:39:04 crc kubenswrapper[4886]: I0129 17:39:04.511649 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:39:05 crc kubenswrapper[4886]: I0129 17:39:05.052304 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqs8b"] Jan 29 17:39:05 crc kubenswrapper[4886]: I0129 17:39:05.945460 4886 generic.go:334] "Generic (PLEG): container finished" podID="d8da04de-c293-46ce-aeae-b2081be3c077" containerID="95fe5d5ec1cc0c1d3c6bdcb2b0f28f4b7f72e0b8cf33d409b80c3bfccdde3d22" exitCode=0 Jan 29 17:39:05 crc kubenswrapper[4886]: I0129 17:39:05.945577 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqs8b" event={"ID":"d8da04de-c293-46ce-aeae-b2081be3c077","Type":"ContainerDied","Data":"95fe5d5ec1cc0c1d3c6bdcb2b0f28f4b7f72e0b8cf33d409b80c3bfccdde3d22"} Jan 29 17:39:05 crc kubenswrapper[4886]: I0129 17:39:05.945731 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqs8b" event={"ID":"d8da04de-c293-46ce-aeae-b2081be3c077","Type":"ContainerStarted","Data":"efefc164eab7dbbf5bc524a94050b180180d68604ff2396211c4fb6aee8d9fad"} Jan 29 17:39:06 crc kubenswrapper[4886]: E0129 17:39:06.090206 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 17:39:06 crc kubenswrapper[4886]: E0129 17:39:06.091065 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4d4z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sqs8b_openshift-marketplace(d8da04de-c293-46ce-aeae-b2081be3c077): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:39:06 crc kubenswrapper[4886]: E0129 17:39:06.092449 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:39:06 crc kubenswrapper[4886]: E0129 17:39:06.961981 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:39:09 crc kubenswrapper[4886]: I0129 17:39:09.615510 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:39:09 crc kubenswrapper[4886]: E0129 17:39:09.616583 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:39:09 crc kubenswrapper[4886]: E0129 17:39:09.742816 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 17:39:09 crc 
kubenswrapper[4886]: E0129 17:39:09.743433 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9d2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7bw7c_openshift-marketplace(c566a66d-f66d-457d-80eb-a0cf5bf4e013): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:39:09 crc kubenswrapper[4886]: E0129 17:39:09.745416 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:39:20 crc kubenswrapper[4886]: E0129 17:39:20.621212 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:39:20 crc kubenswrapper[4886]: E0129 17:39:20.772186 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 17:39:20 crc kubenswrapper[4886]: E0129 17:39:20.772427 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4d4z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sqs8b_openshift-marketplace(d8da04de-c293-46ce-aeae-b2081be3c077): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:39:20 crc kubenswrapper[4886]: E0129 17:39:20.774377 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:39:24 crc kubenswrapper[4886]: I0129 17:39:24.615880 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:39:24 crc kubenswrapper[4886]: E0129 17:39:24.617058 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:39:32 crc kubenswrapper[4886]: E0129 17:39:32.768577 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 17:39:32 crc kubenswrapper[4886]: E0129 17:39:32.769257 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9d2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7bw7c_openshift-marketplace(c566a66d-f66d-457d-80eb-a0cf5bf4e013): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:39:32 crc kubenswrapper[4886]: E0129 17:39:32.770534 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:39:34 crc kubenswrapper[4886]: E0129 17:39:34.618895 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:39:39 crc kubenswrapper[4886]: I0129 17:39:39.616158 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:39:39 crc kubenswrapper[4886]: E0129 17:39:39.616787 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:39:45 crc kubenswrapper[4886]: E0129 17:39:45.619067 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:39:49 crc 
kubenswrapper[4886]: E0129 17:39:49.750289 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 17:39:49 crc kubenswrapper[4886]: E0129 17:39:49.751096 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4d4z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sqs8b_openshift-marketplace(d8da04de-c293-46ce-aeae-b2081be3c077): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:39:49 crc kubenswrapper[4886]: E0129 17:39:49.752749 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:39:54 crc kubenswrapper[4886]: I0129 17:39:54.617016 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:39:54 crc kubenswrapper[4886]: E0129 17:39:54.618191 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:39:58 crc kubenswrapper[4886]: E0129 17:39:58.632907 4886 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:40:04 crc kubenswrapper[4886]: E0129 17:40:04.618417 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:40:09 crc kubenswrapper[4886]: I0129 17:40:09.615900 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:40:09 crc kubenswrapper[4886]: E0129 17:40:09.617225 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:40:09 crc kubenswrapper[4886]: E0129 17:40:09.619317 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.507857 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qsjfd"] Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.511485 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.532926 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qsjfd"] Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.619558 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-utilities\") pod \"certified-operators-qsjfd\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.619780 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-catalog-content\") pod \"certified-operators-qsjfd\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.619935 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlxp8\" (UniqueName: \"kubernetes.io/projected/7ceed770-f253-4044-92f0-c8a07b89b621-kube-api-access-nlxp8\") pod \"certified-operators-qsjfd\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.722451 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlxp8\" (UniqueName: \"kubernetes.io/projected/7ceed770-f253-4044-92f0-c8a07b89b621-kube-api-access-nlxp8\") pod \"certified-operators-qsjfd\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.722644 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-utilities\") pod \"certified-operators-qsjfd\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.722783 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-catalog-content\") pod \"certified-operators-qsjfd\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.723484 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-catalog-content\") pod \"certified-operators-qsjfd\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.723926 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-utilities\") pod \"certified-operators-qsjfd\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.744193 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nlxp8\" (UniqueName: \"kubernetes.io/projected/7ceed770-f253-4044-92f0-c8a07b89b621-kube-api-access-nlxp8\") pod \"certified-operators-qsjfd\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 17:40:14 crc kubenswrapper[4886]: I0129 17:40:14.838297 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 17:40:15 crc kubenswrapper[4886]: I0129 17:40:15.413090 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qsjfd"] Jan 29 17:40:15 crc kubenswrapper[4886]: I0129 17:40:15.940095 4886 generic.go:334] "Generic (PLEG): container finished" podID="7ceed770-f253-4044-92f0-c8a07b89b621" containerID="bedb65e37127565b5119ee8d90f572bdf6b6802d26fcd6797bad10fc8e07c14b" exitCode=0 Jan 29 17:40:15 crc kubenswrapper[4886]: I0129 17:40:15.940166 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsjfd" event={"ID":"7ceed770-f253-4044-92f0-c8a07b89b621","Type":"ContainerDied","Data":"bedb65e37127565b5119ee8d90f572bdf6b6802d26fcd6797bad10fc8e07c14b"} Jan 29 17:40:15 crc kubenswrapper[4886]: I0129 17:40:15.940487 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsjfd" event={"ID":"7ceed770-f253-4044-92f0-c8a07b89b621","Type":"ContainerStarted","Data":"fb5b6b721dd0a2050f48ef0e26fac1871e4ba7b7b47b95e41a00c0852ef2c55b"} Jan 29 17:40:16 crc kubenswrapper[4886]: E0129 17:40:16.075802 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:40:16 crc kubenswrapper[4886]: E0129 17:40:16.076005 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlxp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qsjfd_openshift-marketplace(7ceed770-f253-4044-92f0-c8a07b89b621): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:40:16 crc kubenswrapper[4886]: E0129 17:40:16.078041 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:40:16 crc kubenswrapper[4886]: E0129 17:40:16.954143 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:40:18 crc kubenswrapper[4886]: E0129 17:40:18.628021 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:40:21 crc kubenswrapper[4886]: I0129 17:40:21.615791 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:40:21 crc kubenswrapper[4886]: E0129 17:40:21.616374 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:40:22 
crc kubenswrapper[4886]: E0129 17:40:22.741714 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 17:40:22 crc kubenswrapper[4886]: E0129 17:40:22.742466 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9d2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7bw7c_openshift-marketplace(c566a66d-f66d-457d-80eb-a0cf5bf4e013): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:40:22 crc kubenswrapper[4886]: E0129 17:40:22.744313 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:40:30 crc kubenswrapper[4886]: E0129 17:40:30.748523 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:40:30 crc kubenswrapper[4886]: E0129 17:40:30.749407 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlxp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qsjfd_openshift-marketplace(7ceed770-f253-4044-92f0-c8a07b89b621): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:40:30 crc kubenswrapper[4886]: E0129 17:40:30.751558 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:40:31 crc kubenswrapper[4886]: E0129 17:40:31.740018 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 17:40:31 crc kubenswrapper[4886]: E0129 17:40:31.740659 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4d4z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sqs8b_openshift-marketplace(d8da04de-c293-46ce-aeae-b2081be3c077): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:40:31 crc kubenswrapper[4886]: E0129 17:40:31.741894 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:40:34 crc kubenswrapper[4886]: I0129 17:40:34.616250 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:40:34 crc kubenswrapper[4886]: E0129 17:40:34.617841 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:40:35 crc kubenswrapper[4886]: E0129 17:40:35.618820 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:40:43 crc kubenswrapper[4886]: E0129 17:40:43.619076 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:40:46 crc 
kubenswrapper[4886]: E0129 17:40:46.620160 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:40:47 crc kubenswrapper[4886]: I0129 17:40:47.616174 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:40:47 crc kubenswrapper[4886]: E0129 17:40:47.616754 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:40:50 crc kubenswrapper[4886]: E0129 17:40:50.621397 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:40:54 crc kubenswrapper[4886]: E0129 17:40:54.754612 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:40:54 crc kubenswrapper[4886]: E0129 17:40:54.755367 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlxp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-qsjfd_openshift-marketplace(7ceed770-f253-4044-92f0-c8a07b89b621): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:40:54 crc kubenswrapper[4886]: E0129 17:40:54.756887 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:40:58 crc kubenswrapper[4886]: I0129 17:40:58.615426 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:40:58 crc kubenswrapper[4886]: E0129 17:40:58.616721 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:41:00 crc kubenswrapper[4886]: E0129 17:41:00.618187 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:41:04 crc kubenswrapper[4886]: E0129 17:41:04.619962 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:41:05 crc kubenswrapper[4886]: E0129 17:41:05.617481 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:41:13 crc kubenswrapper[4886]: I0129 17:41:13.615605 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:41:13 crc kubenswrapper[4886]: E0129 17:41:13.616928 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:41:14 crc kubenswrapper[4886]: E0129 17:41:14.620121 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:41:19 crc kubenswrapper[4886]: E0129 17:41:19.618914 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:41:19 crc kubenswrapper[4886]: E0129 17:41:19.619160 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:41:25 crc kubenswrapper[4886]: I0129 17:41:25.615982 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:41:25 crc kubenswrapper[4886]: E0129 17:41:25.617833 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:41:27 crc kubenswrapper[4886]: E0129 17:41:27.620478 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:41:30 crc kubenswrapper[4886]: E0129 17:41:30.619356 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:41:34 crc kubenswrapper[4886]: E0129 17:41:34.620620 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:41:38 crc kubenswrapper[4886]: I0129 17:41:38.628894 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:41:38 crc kubenswrapper[4886]: I0129 17:41:38.950529 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"08c1b8c3edabbeb571f6803cae251f6a7919758b2342154da4b61975a4b2aba4"} Jan 29 17:41:39 crc kubenswrapper[4886]: E0129 17:41:39.617533 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:41:43 crc kubenswrapper[4886]: E0129 17:41:43.750448 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:41:43 crc kubenswrapper[4886]: E0129 17:41:43.751740 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlxp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qsjfd_openshift-marketplace(7ceed770-f253-4044-92f0-c8a07b89b621): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:41:43 crc kubenswrapper[4886]: E0129 17:41:43.752982 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:41:46 crc kubenswrapper[4886]: E0129 17:41:46.753473 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 17:41:46 crc kubenswrapper[4886]: E0129 17:41:46.754698 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9d2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7bw7c_openshift-marketplace(c566a66d-f66d-457d-80eb-a0cf5bf4e013): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:41:46 crc kubenswrapper[4886]: E0129 17:41:46.756066 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:41:54 crc kubenswrapper[4886]: E0129 17:41:54.623888 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:41:54 crc kubenswrapper[4886]: E0129 17:41:54.749934 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 17:41:54 crc kubenswrapper[4886]: E0129 17:41:54.750191 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4d4z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sqs8b_openshift-marketplace(d8da04de-c293-46ce-aeae-b2081be3c077): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:41:54 crc kubenswrapper[4886]: E0129 17:41:54.751741 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:41:57 crc kubenswrapper[4886]: E0129 17:41:57.619686 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:41:59 crc kubenswrapper[4886]: I0129 17:41:59.765481 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="954d7d1e-fd92-4c83-87d8-87a1f866dbbe" containerName="galera" probeResult="failure" output="command timed out" Jan 29 17:41:59 crc kubenswrapper[4886]: I0129 17:41:59.766751 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="954d7d1e-fd92-4c83-87d8-87a1f866dbbe" containerName="galera" probeResult="failure" output="command timed out" Jan 29 17:42:05 crc kubenswrapper[4886]: E0129 17:42:05.621632 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:42:07 crc kubenswrapper[4886]: E0129 17:42:07.620220 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:42:08 crc kubenswrapper[4886]: E0129 17:42:08.645401 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:42:18 crc kubenswrapper[4886]: E0129 17:42:18.636450 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:42:19 crc kubenswrapper[4886]: E0129 17:42:19.619829 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:42:20 crc kubenswrapper[4886]: E0129 17:42:20.618504 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:42:32 crc kubenswrapper[4886]: E0129 17:42:32.618811 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:42:32 crc kubenswrapper[4886]: E0129 17:42:32.621599 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:42:34 crc kubenswrapper[4886]: E0129 17:42:34.618007 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:42:43 crc kubenswrapper[4886]: E0129 17:42:43.621551 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:42:45 crc kubenswrapper[4886]: E0129 17:42:45.618322 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" 
with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:42:46 crc kubenswrapper[4886]: E0129 17:42:46.618865 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:42:54 crc kubenswrapper[4886]: E0129 17:42:54.619762 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:42:57 crc kubenswrapper[4886]: E0129 17:42:57.617498 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:43:00 crc kubenswrapper[4886]: E0129 17:43:00.616582 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:43:06 crc kubenswrapper[4886]: E0129 17:43:06.621045 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:43:12 crc kubenswrapper[4886]: E0129 17:43:12.622415 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:43:12 crc kubenswrapper[4886]: E0129 17:43:12.788594 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:43:12 crc kubenswrapper[4886]: E0129 17:43:12.788773 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlxp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qsjfd_openshift-marketplace(7ceed770-f253-4044-92f0-c8a07b89b621): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:43:12 crc kubenswrapper[4886]: E0129 17:43:12.789989 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:43:17 crc kubenswrapper[4886]: E0129 17:43:17.618390 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:43:23 crc kubenswrapper[4886]: E0129 17:43:23.620913 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:43:25 crc kubenswrapper[4886]: E0129 17:43:25.618890 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:43:31 crc kubenswrapper[4886]: E0129 17:43:31.619082 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:43:36 crc kubenswrapper[4886]: E0129 17:43:36.620294 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:43:38 crc kubenswrapper[4886]: E0129 17:43:38.632221 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:43:42 crc kubenswrapper[4886]: E0129 17:43:42.621894 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:43:47 crc kubenswrapper[4886]: E0129 17:43:47.624855 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:43:51 crc kubenswrapper[4886]: E0129 17:43:51.618635 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:43:54 crc kubenswrapper[4886]: E0129 17:43:54.619366 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:43:59 crc kubenswrapper[4886]: E0129 17:43:59.619372 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:43:59 crc kubenswrapper[4886]: I0129 17:43:59.661733 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:43:59 crc kubenswrapper[4886]: I0129 17:43:59.661818 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Jan 29 17:44:04 crc kubenswrapper[4886]: E0129 17:44:04.619979 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:44:06 crc kubenswrapper[4886]: E0129 17:44:06.619560 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:44:11 crc kubenswrapper[4886]: E0129 17:44:11.617930 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:44:15 crc kubenswrapper[4886]: E0129 17:44:15.619168 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:44:19 crc kubenswrapper[4886]: E0129 17:44:19.619235 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:44:22 crc kubenswrapper[4886]: E0129 17:44:22.621765 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:44:29 crc kubenswrapper[4886]: E0129 17:44:29.618680 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:44:29 crc kubenswrapper[4886]: I0129 17:44:29.661727 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:44:29 crc kubenswrapper[4886]: I0129 17:44:29.662158 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:44:31 crc kubenswrapper[4886]: 
I0129 17:44:31.631355 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 17:44:31 crc kubenswrapper[4886]: E0129 17:44:31.820492 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 17:44:31 crc kubenswrapper[4886]: E0129 17:44:31.820648 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9d2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7bw7c_openshift-marketplace(c566a66d-f66d-457d-80eb-a0cf5bf4e013): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:44:31 crc kubenswrapper[4886]: E0129 17:44:31.821940 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:44:34 crc kubenswrapper[4886]: E0129 17:44:34.639122 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:44:41 crc kubenswrapper[4886]: E0129 17:44:41.761632 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source 
docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 17:44:41 crc kubenswrapper[4886]: E0129 17:44:41.762649 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4d4z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sqs8b_openshift-marketplace(d8da04de-c293-46ce-aeae-b2081be3c077): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:44:41 crc kubenswrapper[4886]: E0129 17:44:41.764808 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:44:45 crc kubenswrapper[4886]: E0129 17:44:45.619610 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:44:47 crc kubenswrapper[4886]: E0129 17:44:47.617865 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:44:52 crc kubenswrapper[4886]: E0129 17:44:52.618360 4886 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:44:59 crc kubenswrapper[4886]: E0129 17:44:59.619180 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:44:59 crc kubenswrapper[4886]: I0129 17:44:59.661314 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:44:59 crc kubenswrapper[4886]: I0129 17:44:59.661438 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:44:59 crc kubenswrapper[4886]: I0129 17:44:59.661504 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 17:44:59 crc kubenswrapper[4886]: I0129 17:44:59.662776 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08c1b8c3edabbeb571f6803cae251f6a7919758b2342154da4b61975a4b2aba4"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 17:44:59 crc kubenswrapper[4886]: I0129 17:44:59.662887 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://08c1b8c3edabbeb571f6803cae251f6a7919758b2342154da4b61975a4b2aba4" gracePeriod=600 Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.159553 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc"] Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.161927 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.168320 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.168439 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.178442 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc"] Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.190172 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-config-volume\") pod \"collect-profiles-29495145-zz7lc\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.190587 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-secret-volume\") pod \"collect-profiles-29495145-zz7lc\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.190785 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p57n8\" (UniqueName: \"kubernetes.io/projected/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-kube-api-access-p57n8\") pod \"collect-profiles-29495145-zz7lc\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.292696 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p57n8\" (UniqueName: \"kubernetes.io/projected/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-kube-api-access-p57n8\") pod \"collect-profiles-29495145-zz7lc\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.293105 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-config-volume\") pod \"collect-profiles-29495145-zz7lc\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.293253 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-secret-volume\") pod \"collect-profiles-29495145-zz7lc\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.294262 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-config-volume\") pod 
\"collect-profiles-29495145-zz7lc\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.301991 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-secret-volume\") pod \"collect-profiles-29495145-zz7lc\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.314038 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p57n8\" (UniqueName: \"kubernetes.io/projected/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-kube-api-access-p57n8\") pod \"collect-profiles-29495145-zz7lc\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.515100 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.591152 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="08c1b8c3edabbeb571f6803cae251f6a7919758b2342154da4b61975a4b2aba4" exitCode=0 Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.591201 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"08c1b8c3edabbeb571f6803cae251f6a7919758b2342154da4b61975a4b2aba4"} Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.591231 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3"} Jan 29 17:45:00 crc kubenswrapper[4886]: I0129 17:45:00.591251 4886 scope.go:117] "RemoveContainer" containerID="4fb3b6296c9f652ca771a622cb99f2be698815449622de5c6a6f7a03eb63e93a" Jan 29 17:45:00 crc kubenswrapper[4886]: E0129 17:45:00.620603 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:45:01 crc kubenswrapper[4886]: I0129 17:45:01.133598 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc"] Jan 29 17:45:01 crc kubenswrapper[4886]: W0129 17:45:01.137562 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb23a9dc7_9e89_4743_9e23_ca27f59fb5e2.slice/crio-94e0d2a7195faec04f61ebf925d4ee5488545ccc1559c385ba7bac3c04f5927e WatchSource:0}: Error finding container 94e0d2a7195faec04f61ebf925d4ee5488545ccc1559c385ba7bac3c04f5927e: Status 404 returned error can't find the container with id 94e0d2a7195faec04f61ebf925d4ee5488545ccc1559c385ba7bac3c04f5927e Jan 29 17:45:01 crc kubenswrapper[4886]: I0129 17:45:01.622360 4886 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" event={"ID":"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2","Type":"ContainerStarted","Data":"2fcb97adc449db7399cd2957592ff329785589715e5cee0d9163a663d660a4ec"} Jan 29 17:45:01 crc kubenswrapper[4886]: I0129 17:45:01.622728 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" event={"ID":"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2","Type":"ContainerStarted","Data":"94e0d2a7195faec04f61ebf925d4ee5488545ccc1559c385ba7bac3c04f5927e"} Jan 29 17:45:01 crc kubenswrapper[4886]: I0129 17:45:01.657640 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" podStartSLOduration=1.657616215 podStartE2EDuration="1.657616215s" podCreationTimestamp="2026-01-29 17:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 17:45:01.647365965 +0000 UTC m=+4984.556085277" watchObservedRunningTime="2026-01-29 17:45:01.657616215 +0000 UTC m=+4984.566335497" Jan 29 17:45:02 crc kubenswrapper[4886]: I0129 17:45:02.680576 4886 generic.go:334] "Generic (PLEG): container finished" podID="b23a9dc7-9e89-4743-9e23-ca27f59fb5e2" containerID="2fcb97adc449db7399cd2957592ff329785589715e5cee0d9163a663d660a4ec" exitCode=0 Jan 29 17:45:02 crc kubenswrapper[4886]: I0129 17:45:02.680787 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" event={"ID":"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2","Type":"ContainerDied","Data":"2fcb97adc449db7399cd2957592ff329785589715e5cee0d9163a663d660a4ec"} Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.715128 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" event={"ID":"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2","Type":"ContainerDied","Data":"94e0d2a7195faec04f61ebf925d4ee5488545ccc1559c385ba7bac3c04f5927e"} Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.715767 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e0d2a7195faec04f61ebf925d4ee5488545ccc1559c385ba7bac3c04f5927e" Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.720066 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.829448 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-secret-volume\") pod \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.829803 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p57n8\" (UniqueName: \"kubernetes.io/projected/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-kube-api-access-p57n8\") pod \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.830119 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-config-volume\") pod \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\" (UID: \"b23a9dc7-9e89-4743-9e23-ca27f59fb5e2\") " Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.830628 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-config-volume" (OuterVolumeSpecName: "config-volume") pod "b23a9dc7-9e89-4743-9e23-ca27f59fb5e2" (UID: "b23a9dc7-9e89-4743-9e23-ca27f59fb5e2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.831046 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.860424 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b23a9dc7-9e89-4743-9e23-ca27f59fb5e2" (UID: "b23a9dc7-9e89-4743-9e23-ca27f59fb5e2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.861068 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-kube-api-access-p57n8" (OuterVolumeSpecName: "kube-api-access-p57n8") pod "b23a9dc7-9e89-4743-9e23-ca27f59fb5e2" (UID: "b23a9dc7-9e89-4743-9e23-ca27f59fb5e2"). InnerVolumeSpecName "kube-api-access-p57n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.934376 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 17:45:04 crc kubenswrapper[4886]: I0129 17:45:04.934428 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p57n8\" (UniqueName: \"kubernetes.io/projected/b23a9dc7-9e89-4743-9e23-ca27f59fb5e2-kube-api-access-p57n8\") on node \"crc\" DevicePath \"\"" Jan 29 17:45:05 crc kubenswrapper[4886]: I0129 17:45:05.729014 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495145-zz7lc" Jan 29 17:45:05 crc kubenswrapper[4886]: I0129 17:45:05.823626 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666"] Jan 29 17:45:05 crc kubenswrapper[4886]: I0129 17:45:05.835555 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495100-wk666"] Jan 29 17:45:06 crc kubenswrapper[4886]: I0129 17:45:06.636983 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da2d212-de01-458b-9805-8eb21ed83324" path="/var/lib/kubelet/pods/3da2d212-de01-458b-9805-8eb21ed83324/volumes" Jan 29 17:45:07 crc kubenswrapper[4886]: E0129 17:45:07.617760 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:45:11 crc kubenswrapper[4886]: E0129 17:45:11.619961 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:45:13 crc kubenswrapper[4886]: E0129 17:45:13.619201 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:45:19 crc kubenswrapper[4886]: E0129 17:45:19.619493 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:45:23 crc kubenswrapper[4886]: E0129 17:45:23.619503 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:45:28 crc kubenswrapper[4886]: E0129 17:45:28.632583 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:45:33 crc kubenswrapper[4886]: E0129 17:45:33.618603 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:45:38 crc kubenswrapper[4886]: E0129 17:45:38.633173 4886 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:45:41 crc kubenswrapper[4886]: E0129 17:45:41.617486 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:45:45 crc kubenswrapper[4886]: E0129 17:45:45.619517 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:45:52 crc kubenswrapper[4886]: E0129 17:45:52.618203 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:45:54 crc kubenswrapper[4886]: E0129 17:45:54.618291 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:45:55 crc kubenswrapper[4886]: I0129 17:45:55.916823 4886 scope.go:117] "RemoveContainer" containerID="3f2a5d53f1118cb99d6ac0f75863b8e8419b33babb29267642e06437ed3d61f8" Jan 29 17:45:59 crc kubenswrapper[4886]: E0129 17:45:59.619002 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:46:07 crc kubenswrapper[4886]: E0129 17:46:07.764668 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:46:07 crc kubenswrapper[4886]: E0129 17:46:07.765548 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlxp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qsjfd_openshift-marketplace(7ceed770-f253-4044-92f0-c8a07b89b621): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:46:07 crc kubenswrapper[4886]: E0129 17:46:07.766834 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:46:08 crc kubenswrapper[4886]: E0129 17:46:08.633262 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:46:11 crc kubenswrapper[4886]: E0129 17:46:11.618748 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:46:18 crc kubenswrapper[4886]: E0129 17:46:18.635585 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:46:22 crc kubenswrapper[4886]: E0129 17:46:22.622441 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:46:22 crc kubenswrapper[4886]: E0129 17:46:22.623458 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:46:31 crc kubenswrapper[4886]: E0129 17:46:31.619898 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:46:34 crc kubenswrapper[4886]: E0129 17:46:34.619442 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:46:35 crc kubenswrapper[4886]: E0129 17:46:35.617945 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:46:44 crc kubenswrapper[4886]: E0129 17:46:44.619218 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:46:46 crc kubenswrapper[4886]: E0129 17:46:46.620625 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:46:48 crc kubenswrapper[4886]: E0129 17:46:48.633468 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:46:59 crc kubenswrapper[4886]: E0129 17:46:59.620766 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:47:00 crc kubenswrapper[4886]: E0129 17:47:00.621205 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:47:00 crc kubenswrapper[4886]: E0129 17:47:00.621292 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:47:11 crc kubenswrapper[4886]: E0129 17:47:11.617443 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:47:12 crc kubenswrapper[4886]: E0129 17:47:12.620413 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:47:13 crc kubenswrapper[4886]: E0129 17:47:13.617299 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:47:22 crc kubenswrapper[4886]: E0129 17:47:22.620190 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:47:25 crc kubenswrapper[4886]: E0129 17:47:25.617407 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:47:28 crc kubenswrapper[4886]: E0129 17:47:28.639768 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:47:29 crc kubenswrapper[4886]: I0129 17:47:29.660681 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:47:29 crc kubenswrapper[4886]: I0129 17:47:29.661563 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Jan 29 17:47:37 crc kubenswrapper[4886]: E0129 17:47:37.618776 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:47:40 crc kubenswrapper[4886]: E0129 17:47:40.617808 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:47:40 crc kubenswrapper[4886]: E0129 17:47:40.617872 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:47:50 crc kubenswrapper[4886]: E0129 17:47:50.736802 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:47:52 crc kubenswrapper[4886]: E0129 17:47:52.623634 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:47:52 crc kubenswrapper[4886]: E0129 17:47:52.623653 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:47:59 crc kubenswrapper[4886]: I0129 17:47:59.660794 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:47:59 crc kubenswrapper[4886]: I0129 17:47:59.661763 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.425609 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-llsc9"] Jan 29 17:48:03 crc kubenswrapper[4886]: E0129 17:48:03.429665 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23a9dc7-9e89-4743-9e23-ca27f59fb5e2" containerName="collect-profiles" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 
17:48:03.429694 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23a9dc7-9e89-4743-9e23-ca27f59fb5e2" containerName="collect-profiles" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.434843 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23a9dc7-9e89-4743-9e23-ca27f59fb5e2" containerName="collect-profiles" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.438571 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.456005 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llsc9"] Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.537189 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-utilities\") pod \"community-operators-llsc9\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.537694 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-catalog-content\") pod \"community-operators-llsc9\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.537765 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2hsm\" (UniqueName: \"kubernetes.io/projected/40bcd274-ae24-4057-aa88-40fd76936d1f-kube-api-access-r2hsm\") pod \"community-operators-llsc9\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:48:03 crc kubenswrapper[4886]: E0129 17:48:03.618076 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.639779 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-catalog-content\") pod \"community-operators-llsc9\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.639877 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2hsm\" (UniqueName: \"kubernetes.io/projected/40bcd274-ae24-4057-aa88-40fd76936d1f-kube-api-access-r2hsm\") pod \"community-operators-llsc9\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.639936 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-utilities\") pod \"community-operators-llsc9\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " pod="openshift-marketplace/community-operators-llsc9" Jan 29 
17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.640649 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-catalog-content\") pod \"community-operators-llsc9\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.640721 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-utilities\") pod \"community-operators-llsc9\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.670248 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2hsm\" (UniqueName: \"kubernetes.io/projected/40bcd274-ae24-4057-aa88-40fd76936d1f-kube-api-access-r2hsm\") pod \"community-operators-llsc9\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:48:03 crc kubenswrapper[4886]: I0129 17:48:03.771983 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:48:04 crc kubenswrapper[4886]: I0129 17:48:04.343697 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llsc9"] Jan 29 17:48:04 crc kubenswrapper[4886]: E0129 17:48:04.616831 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:48:04 crc kubenswrapper[4886]: E0129 17:48:04.617598 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:48:05 crc kubenswrapper[4886]: I0129 17:48:05.082785 4886 generic.go:334] "Generic (PLEG): container finished" podID="40bcd274-ae24-4057-aa88-40fd76936d1f" containerID="df37f7c356fb768cbf7232ec3398b6f87349466aec6de2b10e5c22d7da6bdbda" exitCode=0 Jan 29 17:48:05 crc kubenswrapper[4886]: I0129 17:48:05.082831 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsc9" event={"ID":"40bcd274-ae24-4057-aa88-40fd76936d1f","Type":"ContainerDied","Data":"df37f7c356fb768cbf7232ec3398b6f87349466aec6de2b10e5c22d7da6bdbda"} Jan 29 17:48:05 crc kubenswrapper[4886]: I0129 17:48:05.082861 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsc9" event={"ID":"40bcd274-ae24-4057-aa88-40fd76936d1f","Type":"ContainerStarted","Data":"a88bf7d409b3544d3199be1655f238f3723fa051005797691698fbfffff6a736"} Jan 29 17:48:05 crc kubenswrapper[4886]: E0129 17:48:05.259533 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" 
image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 17:48:05 crc kubenswrapper[4886]: E0129 17:48:05.260106 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2hsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-llsc9_openshift-marketplace(40bcd274-ae24-4057-aa88-40fd76936d1f): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:48:05 crc kubenswrapper[4886]: E0129 17:48:05.261589 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-llsc9" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" Jan 29 17:48:06 crc kubenswrapper[4886]: E0129 17:48:06.099197 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-llsc9" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" Jan 29 17:48:15 crc kubenswrapper[4886]: E0129 17:48:15.618839 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:48:17 crc kubenswrapper[4886]: E0129 17:48:17.619231 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:48:19 crc kubenswrapper[4886]: E0129 17:48:19.641860 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:48:20 crc kubenswrapper[4886]: E0129 17:48:20.765569 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 17:48:20 crc kubenswrapper[4886]: E0129 17:48:20.765745 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2hsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-llsc9_openshift-marketplace(40bcd274-ae24-4057-aa88-40fd76936d1f): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:48:20 crc kubenswrapper[4886]: E0129 17:48:20.767006 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-llsc9" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" Jan 29 17:48:29 crc kubenswrapper[4886]: E0129 17:48:29.617466 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" 
with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:48:29 crc kubenswrapper[4886]: I0129 17:48:29.661429 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:48:29 crc kubenswrapper[4886]: I0129 17:48:29.661501 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:48:29 crc kubenswrapper[4886]: I0129 17:48:29.661554 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 17:48:29 crc kubenswrapper[4886]: I0129 17:48:29.662717 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 17:48:29 crc kubenswrapper[4886]: I0129 17:48:29.662796 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" gracePeriod=600 Jan 29 17:48:29 crc kubenswrapper[4886]: E0129 17:48:29.812738 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:48:30 crc kubenswrapper[4886]: I0129 17:48:30.399872 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" exitCode=0 Jan 29 17:48:30 crc kubenswrapper[4886]: I0129 17:48:30.399920 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3"} Jan 29 17:48:30 crc kubenswrapper[4886]: I0129 17:48:30.399956 4886 scope.go:117] "RemoveContainer" containerID="08c1b8c3edabbeb571f6803cae251f6a7919758b2342154da4b61975a4b2aba4" Jan 29 17:48:30 crc kubenswrapper[4886]: I0129 17:48:30.400728 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:48:30 crc kubenswrapper[4886]: E0129 17:48:30.401026 4886 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:48:32 crc kubenswrapper[4886]: E0129 17:48:32.621653 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:48:32 crc kubenswrapper[4886]: E0129 17:48:32.621696 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:48:33 crc kubenswrapper[4886]: E0129 17:48:33.620680 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-llsc9" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" Jan 29 17:48:44 crc kubenswrapper[4886]: I0129 17:48:44.615967 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:48:44 crc kubenswrapper[4886]: E0129 17:48:44.617031 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:48:44 crc kubenswrapper[4886]: E0129 17:48:44.620057 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:48:44 crc kubenswrapper[4886]: E0129 17:48:44.620115 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:48:47 crc kubenswrapper[4886]: E0129 17:48:47.622149 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:48:48 crc kubenswrapper[4886]: E0129 17:48:48.851448 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc 
= initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 17:48:48 crc kubenswrapper[4886]: E0129 17:48:48.852178 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2hsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-llsc9_openshift-marketplace(40bcd274-ae24-4057-aa88-40fd76936d1f): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:48:48 crc kubenswrapper[4886]: E0129 17:48:48.853408 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-llsc9" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" Jan 29 17:48:55 crc kubenswrapper[4886]: I0129 17:48:55.616399 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:48:55 crc kubenswrapper[4886]: E0129 17:48:55.617799 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:48:55 crc kubenswrapper[4886]: E0129 17:48:55.619209 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off 
pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:48:59 crc kubenswrapper[4886]: E0129 17:48:59.618068 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:49:00 crc kubenswrapper[4886]: E0129 17:49:00.620033 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-llsc9" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" Jan 29 17:49:02 crc kubenswrapper[4886]: E0129 17:49:02.620761 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:49:09 crc kubenswrapper[4886]: I0129 17:49:09.616107 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:49:09 crc kubenswrapper[4886]: E0129 17:49:09.617542 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:49:09 crc kubenswrapper[4886]: E0129 17:49:09.618733 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:49:12 crc kubenswrapper[4886]: E0129 17:49:12.624652 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:49:15 crc kubenswrapper[4886]: E0129 17:49:15.619596 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-llsc9" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" Jan 29 17:49:16 crc kubenswrapper[4886]: E0129 17:49:16.618005 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" 
podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:49:22 crc kubenswrapper[4886]: I0129 17:49:22.615281 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:49:22 crc kubenswrapper[4886]: E0129 17:49:22.616456 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:49:23 crc kubenswrapper[4886]: E0129 17:49:23.617811 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:49:26 crc kubenswrapper[4886]: E0129 17:49:26.619124 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:49:27 crc kubenswrapper[4886]: E0129 17:49:27.617993 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-llsc9" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" Jan 29 17:49:29 crc kubenswrapper[4886]: E0129 17:49:29.618508 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" Jan 29 17:49:34 crc kubenswrapper[4886]: E0129 17:49:34.618993 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" Jan 29 17:49:36 crc kubenswrapper[4886]: I0129 17:49:36.615552 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:49:36 crc kubenswrapper[4886]: E0129 17:49:36.616614 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:49:37 crc kubenswrapper[4886]: E0129 17:49:37.644477 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:49:39 crc kubenswrapper[4886]: I0129 17:49:39.618077 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 17:49:41 crc kubenswrapper[4886]: I0129 17:49:41.456256 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsc9" event={"ID":"40bcd274-ae24-4057-aa88-40fd76936d1f","Type":"ContainerStarted","Data":"6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896"} Jan 29 17:49:42 crc kubenswrapper[4886]: I0129 17:49:42.466347 4886 generic.go:334] "Generic (PLEG): container finished" podID="40bcd274-ae24-4057-aa88-40fd76936d1f" containerID="6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896" exitCode=0 Jan 29 17:49:42 crc kubenswrapper[4886]: I0129 17:49:42.466481 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsc9" event={"ID":"40bcd274-ae24-4057-aa88-40fd76936d1f","Type":"ContainerDied","Data":"6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896"} Jan 29 17:49:43 crc kubenswrapper[4886]: I0129 17:49:43.483033 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsc9" event={"ID":"40bcd274-ae24-4057-aa88-40fd76936d1f","Type":"ContainerStarted","Data":"89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce"} Jan 29 17:49:43 crc kubenswrapper[4886]: I0129 17:49:43.527538 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-llsc9" podStartSLOduration=2.5076267249999997 podStartE2EDuration="1m40.52751156s" podCreationTimestamp="2026-01-29 17:48:03 +0000 UTC" firstStartedPulling="2026-01-29 17:48:05.085582003 +0000 UTC m=+5167.994301285" lastFinishedPulling="2026-01-29 17:49:43.105466808 +0000 UTC m=+5266.014186120" observedRunningTime="2026-01-29 17:49:43.512919877 +0000 UTC m=+5266.421639189" watchObservedRunningTime="2026-01-29 17:49:43.52751156 +0000 UTC m=+5266.436230872" Jan 29 17:49:43 crc kubenswrapper[4886]: I0129 17:49:43.772545 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:49:43 crc kubenswrapper[4886]: I0129 17:49:43.772595 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:49:44 crc kubenswrapper[4886]: I0129 17:49:44.494144 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bw7c" event={"ID":"c566a66d-f66d-457d-80eb-a0cf5bf4e013","Type":"ContainerStarted","Data":"048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0"} Jan 29 17:49:44 crc kubenswrapper[4886]: I0129 17:49:44.839218 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-llsc9" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" containerName="registry-server" probeResult="failure" output=< Jan 29 17:49:44 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 17:49:44 crc kubenswrapper[4886]: > Jan 29 17:49:48 crc kubenswrapper[4886]: I0129 17:49:48.631852 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:49:48 crc 
kubenswrapper[4886]: E0129 17:49:48.633145 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:49:51 crc kubenswrapper[4886]: I0129 17:49:51.653584 4886 generic.go:334] "Generic (PLEG): container finished" podID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerID="048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0" exitCode=0 Jan 29 17:49:51 crc kubenswrapper[4886]: I0129 17:49:51.653705 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bw7c" event={"ID":"c566a66d-f66d-457d-80eb-a0cf5bf4e013","Type":"ContainerDied","Data":"048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0"} Jan 29 17:49:51 crc kubenswrapper[4886]: I0129 17:49:51.659158 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqs8b" event={"ID":"d8da04de-c293-46ce-aeae-b2081be3c077","Type":"ContainerStarted","Data":"d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7"} Jan 29 17:49:52 crc kubenswrapper[4886]: E0129 17:49:52.618003 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:49:53 crc kubenswrapper[4886]: I0129 17:49:53.855044 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:49:53 crc kubenswrapper[4886]: I0129 17:49:53.922709 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:49:54 crc kubenswrapper[4886]: I0129 17:49:54.105698 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llsc9"] Jan 29 17:49:54 crc kubenswrapper[4886]: I0129 17:49:54.751364 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bw7c" event={"ID":"c566a66d-f66d-457d-80eb-a0cf5bf4e013","Type":"ContainerStarted","Data":"bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba"} Jan 29 17:49:54 crc kubenswrapper[4886]: I0129 17:49:54.790277 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7bw7c" podStartSLOduration=3.95254472 podStartE2EDuration="11m3.790245022s" podCreationTimestamp="2026-01-29 17:38:51 +0000 UTC" firstStartedPulling="2026-01-29 17:38:53.777547776 +0000 UTC m=+4616.686267078" lastFinishedPulling="2026-01-29 17:49:53.615248098 +0000 UTC m=+5276.523967380" observedRunningTime="2026-01-29 17:49:54.773622811 +0000 UTC m=+5277.682342123" watchObservedRunningTime="2026-01-29 17:49:54.790245022 +0000 UTC m=+5277.698964334" Jan 29 17:49:55 crc kubenswrapper[4886]: I0129 17:49:55.764605 4886 generic.go:334] "Generic (PLEG): container finished" podID="d8da04de-c293-46ce-aeae-b2081be3c077" containerID="d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7" exitCode=0 Jan 29 17:49:55 crc 
kubenswrapper[4886]: I0129 17:49:55.764698 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqs8b" event={"ID":"d8da04de-c293-46ce-aeae-b2081be3c077","Type":"ContainerDied","Data":"d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7"} Jan 29 17:49:55 crc kubenswrapper[4886]: I0129 17:49:55.765096 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-llsc9" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" containerName="registry-server" containerID="cri-o://89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce" gracePeriod=2 Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.386014 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.434870 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2hsm\" (UniqueName: \"kubernetes.io/projected/40bcd274-ae24-4057-aa88-40fd76936d1f-kube-api-access-r2hsm\") pod \"40bcd274-ae24-4057-aa88-40fd76936d1f\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.435216 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-catalog-content\") pod \"40bcd274-ae24-4057-aa88-40fd76936d1f\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.435251 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-utilities\") pod \"40bcd274-ae24-4057-aa88-40fd76936d1f\" (UID: \"40bcd274-ae24-4057-aa88-40fd76936d1f\") " Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.437121 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-utilities" (OuterVolumeSpecName: "utilities") pod "40bcd274-ae24-4057-aa88-40fd76936d1f" (UID: "40bcd274-ae24-4057-aa88-40fd76936d1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.455075 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40bcd274-ae24-4057-aa88-40fd76936d1f-kube-api-access-r2hsm" (OuterVolumeSpecName: "kube-api-access-r2hsm") pod "40bcd274-ae24-4057-aa88-40fd76936d1f" (UID: "40bcd274-ae24-4057-aa88-40fd76936d1f"). InnerVolumeSpecName "kube-api-access-r2hsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.495872 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40bcd274-ae24-4057-aa88-40fd76936d1f" (UID: "40bcd274-ae24-4057-aa88-40fd76936d1f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.538010 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.538316 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40bcd274-ae24-4057-aa88-40fd76936d1f-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.538351 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2hsm\" (UniqueName: \"kubernetes.io/projected/40bcd274-ae24-4057-aa88-40fd76936d1f-kube-api-access-r2hsm\") on node \"crc\" DevicePath \"\"" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.774676 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqs8b" event={"ID":"d8da04de-c293-46ce-aeae-b2081be3c077","Type":"ContainerStarted","Data":"ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486"} Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.777189 4886 generic.go:334] "Generic (PLEG): container finished" podID="40bcd274-ae24-4057-aa88-40fd76936d1f" containerID="89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce" exitCode=0 Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.777217 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsc9" event={"ID":"40bcd274-ae24-4057-aa88-40fd76936d1f","Type":"ContainerDied","Data":"89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce"} Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.777262 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsc9" event={"ID":"40bcd274-ae24-4057-aa88-40fd76936d1f","Type":"ContainerDied","Data":"a88bf7d409b3544d3199be1655f238f3723fa051005797691698fbfffff6a736"} Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.777288 4886 scope.go:117] "RemoveContainer" containerID="89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.777306 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-llsc9" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.799633 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sqs8b" podStartSLOduration=2.549989153 podStartE2EDuration="10m52.799616773s" podCreationTimestamp="2026-01-29 17:39:04 +0000 UTC" firstStartedPulling="2026-01-29 17:39:05.948449894 +0000 UTC m=+4628.857169166" lastFinishedPulling="2026-01-29 17:49:56.198077514 +0000 UTC m=+5279.106796786" observedRunningTime="2026-01-29 17:49:56.794854238 +0000 UTC m=+5279.703573510" watchObservedRunningTime="2026-01-29 17:49:56.799616773 +0000 UTC m=+5279.708336035" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.802158 4886 scope.go:117] "RemoveContainer" containerID="6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.828679 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llsc9"] Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.830738 4886 scope.go:117] "RemoveContainer" containerID="df37f7c356fb768cbf7232ec3398b6f87349466aec6de2b10e5c22d7da6bdbda" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.842241 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-llsc9"] Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.862865 4886 scope.go:117] "RemoveContainer" containerID="89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce" Jan 29 17:49:56 crc kubenswrapper[4886]: E0129 17:49:56.863348 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce\": container with ID starting with 89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce not found: ID does not exist" containerID="89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.863390 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce"} err="failed to get container status \"89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce\": rpc error: code = NotFound desc = could not find container \"89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce\": container with ID starting with 89e9dc84363622541fc28235465288f22f58590f88406e532aef6fc87edbacce not found: ID does not exist" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.863422 4886 scope.go:117] "RemoveContainer" containerID="6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896" Jan 29 17:49:56 crc kubenswrapper[4886]: E0129 17:49:56.863823 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896\": container with ID starting with 6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896 not found: ID does not exist" containerID="6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.863870 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896"} err="failed to get 
container status \"6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896\": rpc error: code = NotFound desc = could not find container \"6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896\": container with ID starting with 6b9ab19ac11d0ebafbaa4deb030a38d941beeb1b3864ce3572f358b5cd58f896 not found: ID does not exist" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.863899 4886 scope.go:117] "RemoveContainer" containerID="df37f7c356fb768cbf7232ec3398b6f87349466aec6de2b10e5c22d7da6bdbda" Jan 29 17:49:56 crc kubenswrapper[4886]: E0129 17:49:56.864165 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df37f7c356fb768cbf7232ec3398b6f87349466aec6de2b10e5c22d7da6bdbda\": container with ID starting with df37f7c356fb768cbf7232ec3398b6f87349466aec6de2b10e5c22d7da6bdbda not found: ID does not exist" containerID="df37f7c356fb768cbf7232ec3398b6f87349466aec6de2b10e5c22d7da6bdbda" Jan 29 17:49:56 crc kubenswrapper[4886]: I0129 17:49:56.864186 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df37f7c356fb768cbf7232ec3398b6f87349466aec6de2b10e5c22d7da6bdbda"} err="failed to get container status \"df37f7c356fb768cbf7232ec3398b6f87349466aec6de2b10e5c22d7da6bdbda\": rpc error: code = NotFound desc = could not find container \"df37f7c356fb768cbf7232ec3398b6f87349466aec6de2b10e5c22d7da6bdbda\": container with ID starting with df37f7c356fb768cbf7232ec3398b6f87349466aec6de2b10e5c22d7da6bdbda not found: ID does not exist" Jan 29 17:49:58 crc kubenswrapper[4886]: I0129 17:49:58.630676 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" path="/var/lib/kubelet/pods/40bcd274-ae24-4057-aa88-40fd76936d1f/volumes" Jan 29 17:50:02 crc kubenswrapper[4886]: I0129 17:50:02.034269 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:50:02 crc kubenswrapper[4886]: I0129 17:50:02.035035 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:50:03 crc kubenswrapper[4886]: I0129 17:50:03.109863 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerName="registry-server" probeResult="failure" output=< Jan 29 17:50:03 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 17:50:03 crc kubenswrapper[4886]: > Jan 29 17:50:03 crc kubenswrapper[4886]: I0129 17:50:03.615728 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:50:03 crc kubenswrapper[4886]: E0129 17:50:03.617223 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:50:03 crc kubenswrapper[4886]: E0129 17:50:03.617783 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:50:04 crc kubenswrapper[4886]: I0129 17:50:04.512892 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:50:04 crc kubenswrapper[4886]: I0129 17:50:04.512967 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:50:04 crc kubenswrapper[4886]: I0129 17:50:04.610936 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:50:04 crc kubenswrapper[4886]: I0129 17:50:04.966617 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:50:05 crc kubenswrapper[4886]: I0129 17:50:05.044093 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqs8b"] Jan 29 17:50:06 crc kubenswrapper[4886]: I0129 17:50:06.930274 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sqs8b" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" containerName="registry-server" containerID="cri-o://ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486" gracePeriod=2 Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.692033 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.715603 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4d4z\" (UniqueName: \"kubernetes.io/projected/d8da04de-c293-46ce-aeae-b2081be3c077-kube-api-access-q4d4z\") pod \"d8da04de-c293-46ce-aeae-b2081be3c077\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.715662 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-utilities\") pod \"d8da04de-c293-46ce-aeae-b2081be3c077\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.715940 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-catalog-content\") pod \"d8da04de-c293-46ce-aeae-b2081be3c077\" (UID: \"d8da04de-c293-46ce-aeae-b2081be3c077\") " Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.718026 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-utilities" (OuterVolumeSpecName: "utilities") pod "d8da04de-c293-46ce-aeae-b2081be3c077" (UID: "d8da04de-c293-46ce-aeae-b2081be3c077"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.739681 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8da04de-c293-46ce-aeae-b2081be3c077-kube-api-access-q4d4z" (OuterVolumeSpecName: "kube-api-access-q4d4z") pod "d8da04de-c293-46ce-aeae-b2081be3c077" (UID: "d8da04de-c293-46ce-aeae-b2081be3c077"). InnerVolumeSpecName "kube-api-access-q4d4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.749287 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8da04de-c293-46ce-aeae-b2081be3c077" (UID: "d8da04de-c293-46ce-aeae-b2081be3c077"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.819108 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.819137 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4d4z\" (UniqueName: \"kubernetes.io/projected/d8da04de-c293-46ce-aeae-b2081be3c077-kube-api-access-q4d4z\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.819147 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8da04de-c293-46ce-aeae-b2081be3c077-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.945575 4886 generic.go:334] "Generic (PLEG): container finished" podID="d8da04de-c293-46ce-aeae-b2081be3c077" containerID="ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486" exitCode=0 Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.945625 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqs8b" event={"ID":"d8da04de-c293-46ce-aeae-b2081be3c077","Type":"ContainerDied","Data":"ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486"} Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.945653 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqs8b" event={"ID":"d8da04de-c293-46ce-aeae-b2081be3c077","Type":"ContainerDied","Data":"efefc164eab7dbbf5bc524a94050b180180d68604ff2396211c4fb6aee8d9fad"} Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.945670 4886 scope.go:117] "RemoveContainer" containerID="ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486" Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.945805 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqs8b" Jan 29 17:50:07 crc kubenswrapper[4886]: I0129 17:50:07.990309 4886 scope.go:117] "RemoveContainer" containerID="d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7" Jan 29 17:50:08 crc kubenswrapper[4886]: I0129 17:50:08.001871 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqs8b"] Jan 29 17:50:08 crc kubenswrapper[4886]: I0129 17:50:08.011693 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqs8b"] Jan 29 17:50:08 crc kubenswrapper[4886]: I0129 17:50:08.014686 4886 scope.go:117] "RemoveContainer" containerID="95fe5d5ec1cc0c1d3c6bdcb2b0f28f4b7f72e0b8cf33d409b80c3bfccdde3d22" Jan 29 17:50:08 crc kubenswrapper[4886]: I0129 17:50:08.090457 4886 scope.go:117] "RemoveContainer" containerID="ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486" Jan 29 17:50:08 crc kubenswrapper[4886]: E0129 17:50:08.091129 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486\": container with ID starting with ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486 not found: ID does not exist" containerID="ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486" Jan 29 17:50:08 crc kubenswrapper[4886]: I0129 17:50:08.091178 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486"} err="failed to get container status \"ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486\": rpc error: code = NotFound desc = could not find container \"ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486\": container with ID starting with ce0a05bd3d497a8a4069a0652d1a5685775d958d3b77b12a4fb2cd4858595486 not found: ID does not exist" Jan 29 17:50:08 crc kubenswrapper[4886]: I0129 17:50:08.091206 4886 scope.go:117] "RemoveContainer" containerID="d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7" Jan 29 17:50:08 crc kubenswrapper[4886]: E0129 17:50:08.091915 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7\": container with ID starting with d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7 not found: ID does not exist" containerID="d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7" Jan 29 17:50:08 crc kubenswrapper[4886]: I0129 17:50:08.091951 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7"} err="failed to get container status \"d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7\": rpc error: code = NotFound desc = could not find container \"d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7\": container with ID starting with d74818ab52ae29443c5955bc1974c1dbb7212a33d4252b321985fe8bb4f905d7 not found: ID does not exist" Jan 29 17:50:08 crc kubenswrapper[4886]: I0129 17:50:08.091985 4886 scope.go:117] "RemoveContainer" containerID="95fe5d5ec1cc0c1d3c6bdcb2b0f28f4b7f72e0b8cf33d409b80c3bfccdde3d22" Jan 29 17:50:08 crc kubenswrapper[4886]: E0129 17:50:08.092576 4886 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"95fe5d5ec1cc0c1d3c6bdcb2b0f28f4b7f72e0b8cf33d409b80c3bfccdde3d22\": container with ID starting with 95fe5d5ec1cc0c1d3c6bdcb2b0f28f4b7f72e0b8cf33d409b80c3bfccdde3d22 not found: ID does not exist" containerID="95fe5d5ec1cc0c1d3c6bdcb2b0f28f4b7f72e0b8cf33d409b80c3bfccdde3d22" Jan 29 17:50:08 crc kubenswrapper[4886]: I0129 17:50:08.092605 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95fe5d5ec1cc0c1d3c6bdcb2b0f28f4b7f72e0b8cf33d409b80c3bfccdde3d22"} err="failed to get container status \"95fe5d5ec1cc0c1d3c6bdcb2b0f28f4b7f72e0b8cf33d409b80c3bfccdde3d22\": rpc error: code = NotFound desc = could not find container \"95fe5d5ec1cc0c1d3c6bdcb2b0f28f4b7f72e0b8cf33d409b80c3bfccdde3d22\": container with ID starting with 95fe5d5ec1cc0c1d3c6bdcb2b0f28f4b7f72e0b8cf33d409b80c3bfccdde3d22 not found: ID does not exist" Jan 29 17:50:08 crc kubenswrapper[4886]: I0129 17:50:08.631109 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" path="/var/lib/kubelet/pods/d8da04de-c293-46ce-aeae-b2081be3c077/volumes" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.267299 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lrwxm"] Jan 29 17:50:09 crc kubenswrapper[4886]: E0129 17:50:09.268006 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" containerName="extract-utilities" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.268022 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" containerName="extract-utilities" Jan 29 17:50:09 crc kubenswrapper[4886]: E0129 17:50:09.268043 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" containerName="extract-utilities" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.268051 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" containerName="extract-utilities" Jan 29 17:50:09 crc kubenswrapper[4886]: E0129 17:50:09.268078 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" containerName="extract-content" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.268086 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" containerName="extract-content" Jan 29 17:50:09 crc kubenswrapper[4886]: E0129 17:50:09.268119 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" containerName="extract-content" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.268127 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" containerName="extract-content" Jan 29 17:50:09 crc kubenswrapper[4886]: E0129 17:50:09.268138 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" containerName="registry-server" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.268145 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" containerName="registry-server" Jan 29 17:50:09 crc kubenswrapper[4886]: E0129 17:50:09.268160 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" 
containerName="registry-server" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.268167 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" containerName="registry-server" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.268504 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8da04de-c293-46ce-aeae-b2081be3c077" containerName="registry-server" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.268522 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="40bcd274-ae24-4057-aa88-40fd76936d1f" containerName="registry-server" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.270764 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.309152 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrwxm"] Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.360565 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ph9l\" (UniqueName: \"kubernetes.io/projected/54c33a8c-623a-409c-8586-7b4c3c1c0510-kube-api-access-2ph9l\") pod \"redhat-marketplace-lrwxm\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.360697 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-utilities\") pod \"redhat-marketplace-lrwxm\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.361149 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-catalog-content\") pod \"redhat-marketplace-lrwxm\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.464287 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-catalog-content\") pod \"redhat-marketplace-lrwxm\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.464628 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ph9l\" (UniqueName: \"kubernetes.io/projected/54c33a8c-623a-409c-8586-7b4c3c1c0510-kube-api-access-2ph9l\") pod \"redhat-marketplace-lrwxm\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.464727 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-utilities\") pod \"redhat-marketplace-lrwxm\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.464993 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-catalog-content\") pod \"redhat-marketplace-lrwxm\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:09 crc kubenswrapper[4886]: I0129 17:50:09.465278 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-utilities\") pod \"redhat-marketplace-lrwxm\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:10 crc kubenswrapper[4886]: I0129 17:50:10.263616 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ph9l\" (UniqueName: \"kubernetes.io/projected/54c33a8c-623a-409c-8586-7b4c3c1c0510-kube-api-access-2ph9l\") pod \"redhat-marketplace-lrwxm\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:10 crc kubenswrapper[4886]: I0129 17:50:10.503541 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:11 crc kubenswrapper[4886]: I0129 17:50:11.119752 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrwxm"] Jan 29 17:50:12 crc kubenswrapper[4886]: I0129 17:50:12.037085 4886 generic.go:334] "Generic (PLEG): container finished" podID="54c33a8c-623a-409c-8586-7b4c3c1c0510" containerID="8c89c20de1b6c1aa5e210e0a36da94a9f8bda518322088c323c4b12e10362b1c" exitCode=0 Jan 29 17:50:12 crc kubenswrapper[4886]: I0129 17:50:12.037489 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrwxm" event={"ID":"54c33a8c-623a-409c-8586-7b4c3c1c0510","Type":"ContainerDied","Data":"8c89c20de1b6c1aa5e210e0a36da94a9f8bda518322088c323c4b12e10362b1c"} Jan 29 17:50:12 crc kubenswrapper[4886]: I0129 17:50:12.037569 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrwxm" event={"ID":"54c33a8c-623a-409c-8586-7b4c3c1c0510","Type":"ContainerStarted","Data":"cef715b4d1f263de7fa710e1aeabb63fa13c49ab6f66feea0bfb2c6c3415b7ca"} Jan 29 17:50:13 crc kubenswrapper[4886]: I0129 17:50:13.048622 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrwxm" event={"ID":"54c33a8c-623a-409c-8586-7b4c3c1c0510","Type":"ContainerStarted","Data":"a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a"} Jan 29 17:50:13 crc kubenswrapper[4886]: I0129 17:50:13.089957 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerName="registry-server" probeResult="failure" output=< Jan 29 17:50:13 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 17:50:13 crc kubenswrapper[4886]: > Jan 29 17:50:14 crc kubenswrapper[4886]: I0129 17:50:14.065832 4886 generic.go:334] "Generic (PLEG): container finished" podID="54c33a8c-623a-409c-8586-7b4c3c1c0510" containerID="a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a" exitCode=0 Jan 29 17:50:14 crc kubenswrapper[4886]: I0129 17:50:14.066056 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrwxm" 
event={"ID":"54c33a8c-623a-409c-8586-7b4c3c1c0510","Type":"ContainerDied","Data":"a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a"} Jan 29 17:50:15 crc kubenswrapper[4886]: I0129 17:50:15.084726 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrwxm" event={"ID":"54c33a8c-623a-409c-8586-7b4c3c1c0510","Type":"ContainerStarted","Data":"27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45"} Jan 29 17:50:17 crc kubenswrapper[4886]: I0129 17:50:17.616681 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:50:17 crc kubenswrapper[4886]: E0129 17:50:17.617954 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:50:18 crc kubenswrapper[4886]: E0129 17:50:18.635526 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:50:20 crc kubenswrapper[4886]: I0129 17:50:20.503848 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:20 crc kubenswrapper[4886]: I0129 17:50:20.504204 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:20 crc kubenswrapper[4886]: I0129 17:50:20.603938 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:20 crc kubenswrapper[4886]: I0129 17:50:20.650942 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lrwxm" podStartSLOduration=9.171257753 podStartE2EDuration="11.650914984s" podCreationTimestamp="2026-01-29 17:50:09 +0000 UTC" firstStartedPulling="2026-01-29 17:50:12.041390533 +0000 UTC m=+5294.950109825" lastFinishedPulling="2026-01-29 17:50:14.521047764 +0000 UTC m=+5297.429767056" observedRunningTime="2026-01-29 17:50:15.116718847 +0000 UTC m=+5298.025438119" watchObservedRunningTime="2026-01-29 17:50:20.650914984 +0000 UTC m=+5303.559634296" Jan 29 17:50:21 crc kubenswrapper[4886]: I0129 17:50:21.228360 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:21 crc kubenswrapper[4886]: I0129 17:50:21.276235 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrwxm"] Jan 29 17:50:22 crc kubenswrapper[4886]: I0129 17:50:22.548677 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:50:22 crc kubenswrapper[4886]: I0129 17:50:22.641340 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.174882 4886 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lrwxm" podUID="54c33a8c-623a-409c-8586-7b4c3c1c0510" containerName="registry-server" containerID="cri-o://27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45" gracePeriod=2 Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.263588 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7bw7c"] Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.779361 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.884001 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ph9l\" (UniqueName: \"kubernetes.io/projected/54c33a8c-623a-409c-8586-7b4c3c1c0510-kube-api-access-2ph9l\") pod \"54c33a8c-623a-409c-8586-7b4c3c1c0510\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.884524 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-catalog-content\") pod \"54c33a8c-623a-409c-8586-7b4c3c1c0510\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.884676 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-utilities\") pod \"54c33a8c-623a-409c-8586-7b4c3c1c0510\" (UID: \"54c33a8c-623a-409c-8586-7b4c3c1c0510\") " Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.885700 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-utilities" (OuterVolumeSpecName: "utilities") pod "54c33a8c-623a-409c-8586-7b4c3c1c0510" (UID: "54c33a8c-623a-409c-8586-7b4c3c1c0510"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.891252 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c33a8c-623a-409c-8586-7b4c3c1c0510-kube-api-access-2ph9l" (OuterVolumeSpecName: "kube-api-access-2ph9l") pod "54c33a8c-623a-409c-8586-7b4c3c1c0510" (UID: "54c33a8c-623a-409c-8586-7b4c3c1c0510"). InnerVolumeSpecName "kube-api-access-2ph9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.908005 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54c33a8c-623a-409c-8586-7b4c3c1c0510" (UID: "54c33a8c-623a-409c-8586-7b4c3c1c0510"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.988601 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ph9l\" (UniqueName: \"kubernetes.io/projected/54c33a8c-623a-409c-8586-7b4c3c1c0510-kube-api-access-2ph9l\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.988656 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:23 crc kubenswrapper[4886]: I0129 17:50:23.988676 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54c33a8c-623a-409c-8586-7b4c3c1c0510-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.185244 4886 generic.go:334] "Generic (PLEG): container finished" podID="54c33a8c-623a-409c-8586-7b4c3c1c0510" containerID="27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45" exitCode=0 Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.185351 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrwxm" event={"ID":"54c33a8c-623a-409c-8586-7b4c3c1c0510","Type":"ContainerDied","Data":"27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45"} Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.185445 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrwxm" event={"ID":"54c33a8c-623a-409c-8586-7b4c3c1c0510","Type":"ContainerDied","Data":"cef715b4d1f263de7fa710e1aeabb63fa13c49ab6f66feea0bfb2c6c3415b7ca"} Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.185475 4886 scope.go:117] "RemoveContainer" containerID="27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.185481 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7bw7c" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerName="registry-server" containerID="cri-o://bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba" gracePeriod=2 Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.185835 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrwxm" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.222029 4886 scope.go:117] "RemoveContainer" containerID="a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.227474 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrwxm"] Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.245667 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrwxm"] Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.265915 4886 scope.go:117] "RemoveContainer" containerID="8c89c20de1b6c1aa5e210e0a36da94a9f8bda518322088c323c4b12e10362b1c" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.462113 4886 scope.go:117] "RemoveContainer" containerID="27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45" Jan 29 17:50:24 crc kubenswrapper[4886]: E0129 17:50:24.462677 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45\": container with ID starting with 27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45 not found: ID does not exist" containerID="27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.462730 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45"} err="failed to get container status \"27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45\": rpc error: code = NotFound desc = could not find container \"27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45\": container with ID starting with 27f046a674100ab834a75f639ec3d0dce4924491f8bfcffb76b533e0fac55c45 not found: ID does not exist" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.462760 4886 scope.go:117] "RemoveContainer" containerID="a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a" Jan 29 17:50:24 crc kubenswrapper[4886]: E0129 17:50:24.465800 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a\": container with ID starting with a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a not found: ID does not exist" containerID="a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.465858 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a"} err="failed to get container status \"a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a\": rpc error: code = NotFound desc = could not find container \"a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a\": container with ID starting with a6282dc7f559738f374de038d372b30a6cdc01fff3a49010d814fb2959bb189a not found: ID does not exist" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.465881 4886 scope.go:117] "RemoveContainer" containerID="8c89c20de1b6c1aa5e210e0a36da94a9f8bda518322088c323c4b12e10362b1c" Jan 29 17:50:24 crc kubenswrapper[4886]: E0129 17:50:24.466237 4886 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8c89c20de1b6c1aa5e210e0a36da94a9f8bda518322088c323c4b12e10362b1c\": container with ID starting with 8c89c20de1b6c1aa5e210e0a36da94a9f8bda518322088c323c4b12e10362b1c not found: ID does not exist" containerID="8c89c20de1b6c1aa5e210e0a36da94a9f8bda518322088c323c4b12e10362b1c" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.466304 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c89c20de1b6c1aa5e210e0a36da94a9f8bda518322088c323c4b12e10362b1c"} err="failed to get container status \"8c89c20de1b6c1aa5e210e0a36da94a9f8bda518322088c323c4b12e10362b1c\": rpc error: code = NotFound desc = could not find container \"8c89c20de1b6c1aa5e210e0a36da94a9f8bda518322088c323c4b12e10362b1c\": container with ID starting with 8c89c20de1b6c1aa5e210e0a36da94a9f8bda518322088c323c4b12e10362b1c not found: ID does not exist" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.628248 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c33a8c-623a-409c-8586-7b4c3c1c0510" path="/var/lib/kubelet/pods/54c33a8c-623a-409c-8586-7b4c3c1c0510/volumes" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.773689 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.925976 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d2ph\" (UniqueName: \"kubernetes.io/projected/c566a66d-f66d-457d-80eb-a0cf5bf4e013-kube-api-access-9d2ph\") pod \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.926093 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-catalog-content\") pod \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.926162 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-utilities\") pod \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\" (UID: \"c566a66d-f66d-457d-80eb-a0cf5bf4e013\") " Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.928225 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-utilities" (OuterVolumeSpecName: "utilities") pod "c566a66d-f66d-457d-80eb-a0cf5bf4e013" (UID: "c566a66d-f66d-457d-80eb-a0cf5bf4e013"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:50:24 crc kubenswrapper[4886]: I0129 17:50:24.941675 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c566a66d-f66d-457d-80eb-a0cf5bf4e013-kube-api-access-9d2ph" (OuterVolumeSpecName: "kube-api-access-9d2ph") pod "c566a66d-f66d-457d-80eb-a0cf5bf4e013" (UID: "c566a66d-f66d-457d-80eb-a0cf5bf4e013"). InnerVolumeSpecName "kube-api-access-9d2ph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.029077 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d2ph\" (UniqueName: \"kubernetes.io/projected/c566a66d-f66d-457d-80eb-a0cf5bf4e013-kube-api-access-9d2ph\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.029113 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.103475 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c566a66d-f66d-457d-80eb-a0cf5bf4e013" (UID: "c566a66d-f66d-457d-80eb-a0cf5bf4e013"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.132014 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c566a66d-f66d-457d-80eb-a0cf5bf4e013-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.201232 4886 generic.go:334] "Generic (PLEG): container finished" podID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerID="bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba" exitCode=0 Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.201279 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bw7c" event={"ID":"c566a66d-f66d-457d-80eb-a0cf5bf4e013","Type":"ContainerDied","Data":"bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba"} Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.201349 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bw7c" event={"ID":"c566a66d-f66d-457d-80eb-a0cf5bf4e013","Type":"ContainerDied","Data":"cab69af52cd3a4f3f325f6b78803a593e82fd270c10956a862ec4c1b3df6eb47"} Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.201347 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7bw7c" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.201377 4886 scope.go:117] "RemoveContainer" containerID="bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.233639 4886 scope.go:117] "RemoveContainer" containerID="048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.250964 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7bw7c"] Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.269243 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7bw7c"] Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.281230 4886 scope.go:117] "RemoveContainer" containerID="31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.304973 4886 scope.go:117] "RemoveContainer" containerID="bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba" Jan 29 17:50:25 crc kubenswrapper[4886]: E0129 17:50:25.305689 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba\": container with ID starting with bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba not found: ID does not exist" containerID="bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.305764 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba"} err="failed to get container status \"bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba\": rpc error: code = NotFound desc = could not find container \"bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba\": container with ID starting with bc86b5548a2f6b98575b342d99a002bfb0143807c9dd174f5af50b3baca239ba not found: ID does not exist" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.305809 4886 scope.go:117] "RemoveContainer" containerID="048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0" Jan 29 17:50:25 crc kubenswrapper[4886]: E0129 17:50:25.306347 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0\": container with ID starting with 048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0 not found: ID does not exist" containerID="048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.306395 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0"} err="failed to get container status \"048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0\": rpc error: code = NotFound desc = could not find container \"048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0\": container with ID starting with 048fbc3f19e9f2bb3a22233ff84755a02d78dda5d7adaf81250ada584b2655f0 not found: ID does not exist" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.306448 4886 scope.go:117] "RemoveContainer" 
containerID="31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace" Jan 29 17:50:25 crc kubenswrapper[4886]: E0129 17:50:25.306884 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace\": container with ID starting with 31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace not found: ID does not exist" containerID="31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace" Jan 29 17:50:25 crc kubenswrapper[4886]: I0129 17:50:25.307064 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace"} err="failed to get container status \"31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace\": rpc error: code = NotFound desc = could not find container \"31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace\": container with ID starting with 31280720311a3cf46c0d281650fde637fb00d0bd369f8b6e628ebaffb4d39ace not found: ID does not exist" Jan 29 17:50:26 crc kubenswrapper[4886]: I0129 17:50:26.633428 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" path="/var/lib/kubelet/pods/c566a66d-f66d-457d-80eb-a0cf5bf4e013/volumes" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.485718 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6tlwv"] Jan 29 17:50:27 crc kubenswrapper[4886]: E0129 17:50:27.487194 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerName="extract-utilities" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.487244 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerName="extract-utilities" Jan 29 17:50:27 crc kubenswrapper[4886]: E0129 17:50:27.487288 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c33a8c-623a-409c-8586-7b4c3c1c0510" containerName="registry-server" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.487302 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c33a8c-623a-409c-8586-7b4c3c1c0510" containerName="registry-server" Jan 29 17:50:27 crc kubenswrapper[4886]: E0129 17:50:27.487364 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c33a8c-623a-409c-8586-7b4c3c1c0510" containerName="extract-utilities" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.487378 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c33a8c-623a-409c-8586-7b4c3c1c0510" containerName="extract-utilities" Jan 29 17:50:27 crc kubenswrapper[4886]: E0129 17:50:27.487426 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c33a8c-623a-409c-8586-7b4c3c1c0510" containerName="extract-content" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.487438 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c33a8c-623a-409c-8586-7b4c3c1c0510" containerName="extract-content" Jan 29 17:50:27 crc kubenswrapper[4886]: E0129 17:50:27.487453 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerName="registry-server" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.487464 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerName="registry-server" Jan 29 
17:50:27 crc kubenswrapper[4886]: E0129 17:50:27.487493 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerName="extract-content" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.487506 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerName="extract-content" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.487943 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c566a66d-f66d-457d-80eb-a0cf5bf4e013" containerName="registry-server" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.488009 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c33a8c-623a-409c-8586-7b4c3c1c0510" containerName="registry-server" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.492052 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.500007 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6tlwv"] Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.632033 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h94ld\" (UniqueName: \"kubernetes.io/projected/23a03da1-7fa0-41f6-b906-4769ab664bc5-kube-api-access-h94ld\") pod \"redhat-operators-6tlwv\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.632382 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-catalog-content\") pod \"redhat-operators-6tlwv\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.632675 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-utilities\") pod \"redhat-operators-6tlwv\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.734933 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-utilities\") pod \"redhat-operators-6tlwv\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.735003 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h94ld\" (UniqueName: \"kubernetes.io/projected/23a03da1-7fa0-41f6-b906-4769ab664bc5-kube-api-access-h94ld\") pod \"redhat-operators-6tlwv\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.735094 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-catalog-content\") pod \"redhat-operators-6tlwv\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " 
pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.735544 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-utilities\") pod \"redhat-operators-6tlwv\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.735696 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-catalog-content\") pod \"redhat-operators-6tlwv\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.754141 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h94ld\" (UniqueName: \"kubernetes.io/projected/23a03da1-7fa0-41f6-b906-4769ab664bc5-kube-api-access-h94ld\") pod \"redhat-operators-6tlwv\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:27 crc kubenswrapper[4886]: I0129 17:50:27.851656 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:28 crc kubenswrapper[4886]: I0129 17:50:28.334487 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6tlwv"] Jan 29 17:50:29 crc kubenswrapper[4886]: I0129 17:50:29.260058 4886 generic.go:334] "Generic (PLEG): container finished" podID="23a03da1-7fa0-41f6-b906-4769ab664bc5" containerID="bcc6a4ee143fa849dab16d564a7897d3593761bb8a9147e60f4b959298b059fb" exitCode=0 Jan 29 17:50:29 crc kubenswrapper[4886]: I0129 17:50:29.260401 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tlwv" event={"ID":"23a03da1-7fa0-41f6-b906-4769ab664bc5","Type":"ContainerDied","Data":"bcc6a4ee143fa849dab16d564a7897d3593761bb8a9147e60f4b959298b059fb"} Jan 29 17:50:29 crc kubenswrapper[4886]: I0129 17:50:29.260440 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tlwv" event={"ID":"23a03da1-7fa0-41f6-b906-4769ab664bc5","Type":"ContainerStarted","Data":"a202a48d002122e515252ad53e71cce754e31eedf1ab1c6214ecb88c2058cfde"} Jan 29 17:50:30 crc kubenswrapper[4886]: I0129 17:50:30.617649 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:50:30 crc kubenswrapper[4886]: E0129 17:50:30.618271 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:50:31 crc kubenswrapper[4886]: I0129 17:50:31.281394 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tlwv" event={"ID":"23a03da1-7fa0-41f6-b906-4769ab664bc5","Type":"ContainerStarted","Data":"f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf"} Jan 29 17:50:32 crc kubenswrapper[4886]: E0129 17:50:32.620743 4886 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:50:37 crc kubenswrapper[4886]: I0129 17:50:37.367237 4886 generic.go:334] "Generic (PLEG): container finished" podID="23a03da1-7fa0-41f6-b906-4769ab664bc5" containerID="f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf" exitCode=0 Jan 29 17:50:37 crc kubenswrapper[4886]: I0129 17:50:37.367352 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tlwv" event={"ID":"23a03da1-7fa0-41f6-b906-4769ab664bc5","Type":"ContainerDied","Data":"f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf"} Jan 29 17:50:38 crc kubenswrapper[4886]: I0129 17:50:38.382718 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tlwv" event={"ID":"23a03da1-7fa0-41f6-b906-4769ab664bc5","Type":"ContainerStarted","Data":"8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c"} Jan 29 17:50:38 crc kubenswrapper[4886]: I0129 17:50:38.415855 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6tlwv" podStartSLOduration=2.634675891 podStartE2EDuration="11.415828356s" podCreationTimestamp="2026-01-29 17:50:27 +0000 UTC" firstStartedPulling="2026-01-29 17:50:29.263228123 +0000 UTC m=+5312.171947435" lastFinishedPulling="2026-01-29 17:50:38.044380608 +0000 UTC m=+5320.953099900" observedRunningTime="2026-01-29 17:50:38.410567557 +0000 UTC m=+5321.319286859" watchObservedRunningTime="2026-01-29 17:50:38.415828356 +0000 UTC m=+5321.324547668" Jan 29 17:50:41 crc kubenswrapper[4886]: I0129 17:50:41.615686 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:50:41 crc kubenswrapper[4886]: E0129 17:50:41.616814 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:50:47 crc kubenswrapper[4886]: E0129 17:50:47.622519 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:50:47 crc kubenswrapper[4886]: I0129 17:50:47.852887 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:47 crc kubenswrapper[4886]: I0129 17:50:47.852982 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:48 crc kubenswrapper[4886]: I0129 17:50:48.641067 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:48 crc kubenswrapper[4886]: I0129 17:50:48.699433 4886 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:49 crc kubenswrapper[4886]: I0129 17:50:49.083364 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6tlwv"] Jan 29 17:50:50 crc kubenswrapper[4886]: I0129 17:50:50.539732 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6tlwv" podUID="23a03da1-7fa0-41f6-b906-4769ab664bc5" containerName="registry-server" containerID="cri-o://8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c" gracePeriod=2 Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.154106 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.264129 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-catalog-content\") pod \"23a03da1-7fa0-41f6-b906-4769ab664bc5\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.264640 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h94ld\" (UniqueName: \"kubernetes.io/projected/23a03da1-7fa0-41f6-b906-4769ab664bc5-kube-api-access-h94ld\") pod \"23a03da1-7fa0-41f6-b906-4769ab664bc5\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.266527 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-utilities\") pod \"23a03da1-7fa0-41f6-b906-4769ab664bc5\" (UID: \"23a03da1-7fa0-41f6-b906-4769ab664bc5\") " Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.268677 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-utilities" (OuterVolumeSpecName: "utilities") pod "23a03da1-7fa0-41f6-b906-4769ab664bc5" (UID: "23a03da1-7fa0-41f6-b906-4769ab664bc5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.271929 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a03da1-7fa0-41f6-b906-4769ab664bc5-kube-api-access-h94ld" (OuterVolumeSpecName: "kube-api-access-h94ld") pod "23a03da1-7fa0-41f6-b906-4769ab664bc5" (UID: "23a03da1-7fa0-41f6-b906-4769ab664bc5"). InnerVolumeSpecName "kube-api-access-h94ld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.371225 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h94ld\" (UniqueName: \"kubernetes.io/projected/23a03da1-7fa0-41f6-b906-4769ab664bc5-kube-api-access-h94ld\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.371256 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.397224 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23a03da1-7fa0-41f6-b906-4769ab664bc5" (UID: "23a03da1-7fa0-41f6-b906-4769ab664bc5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.476888 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a03da1-7fa0-41f6-b906-4769ab664bc5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.550924 4886 generic.go:334] "Generic (PLEG): container finished" podID="23a03da1-7fa0-41f6-b906-4769ab664bc5" containerID="8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c" exitCode=0 Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.550965 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tlwv" event={"ID":"23a03da1-7fa0-41f6-b906-4769ab664bc5","Type":"ContainerDied","Data":"8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c"} Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.550994 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tlwv" event={"ID":"23a03da1-7fa0-41f6-b906-4769ab664bc5","Type":"ContainerDied","Data":"a202a48d002122e515252ad53e71cce754e31eedf1ab1c6214ecb88c2058cfde"} Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.551009 4886 scope.go:117] "RemoveContainer" containerID="8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.551032 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6tlwv" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.593991 4886 scope.go:117] "RemoveContainer" containerID="f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.616156 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6tlwv"] Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.620006 4886 scope.go:117] "RemoveContainer" containerID="bcc6a4ee143fa849dab16d564a7897d3593761bb8a9147e60f4b959298b059fb" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.627924 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6tlwv"] Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.692770 4886 scope.go:117] "RemoveContainer" containerID="8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c" Jan 29 17:50:51 crc kubenswrapper[4886]: E0129 17:50:51.693427 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c\": container with ID starting with 8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c not found: ID does not exist" containerID="8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.693497 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c"} err="failed to get container status \"8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c\": rpc error: code = NotFound desc = could not find container \"8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c\": container with ID starting with 8df92aae9f8620b2d061beecbaf2f5bce72758e4828378f15af511a411ef6e6c not found: ID does not exist" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.693546 4886 scope.go:117] "RemoveContainer" containerID="f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf" Jan 29 17:50:51 crc kubenswrapper[4886]: E0129 17:50:51.694091 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf\": container with ID starting with f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf not found: ID does not exist" containerID="f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.694122 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf"} err="failed to get container status \"f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf\": rpc error: code = NotFound desc = could not find container \"f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf\": container with ID starting with f0631425ddab1323041e7ecf7489d9c47b65f44f8f52eae86f1730126b411aaf not found: ID does not exist" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.694141 4886 scope.go:117] "RemoveContainer" containerID="bcc6a4ee143fa849dab16d564a7897d3593761bb8a9147e60f4b959298b059fb" Jan 29 17:50:51 crc kubenswrapper[4886]: E0129 17:50:51.694672 4886 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"bcc6a4ee143fa849dab16d564a7897d3593761bb8a9147e60f4b959298b059fb\": container with ID starting with bcc6a4ee143fa849dab16d564a7897d3593761bb8a9147e60f4b959298b059fb not found: ID does not exist" containerID="bcc6a4ee143fa849dab16d564a7897d3593761bb8a9147e60f4b959298b059fb" Jan 29 17:50:51 crc kubenswrapper[4886]: I0129 17:50:51.694737 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc6a4ee143fa849dab16d564a7897d3593761bb8a9147e60f4b959298b059fb"} err="failed to get container status \"bcc6a4ee143fa849dab16d564a7897d3593761bb8a9147e60f4b959298b059fb\": rpc error: code = NotFound desc = could not find container \"bcc6a4ee143fa849dab16d564a7897d3593761bb8a9147e60f4b959298b059fb\": container with ID starting with bcc6a4ee143fa849dab16d564a7897d3593761bb8a9147e60f4b959298b059fb not found: ID does not exist" Jan 29 17:50:52 crc kubenswrapper[4886]: I0129 17:50:52.630342 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a03da1-7fa0-41f6-b906-4769ab664bc5" path="/var/lib/kubelet/pods/23a03da1-7fa0-41f6-b906-4769ab664bc5/volumes" Jan 29 17:50:53 crc kubenswrapper[4886]: I0129 17:50:53.615615 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:50:53 crc kubenswrapper[4886]: E0129 17:50:53.616566 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:50:58 crc kubenswrapper[4886]: E0129 17:50:58.636869 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:51:07 crc kubenswrapper[4886]: I0129 17:51:07.616119 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:51:07 crc kubenswrapper[4886]: E0129 17:51:07.617173 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:51:13 crc kubenswrapper[4886]: E0129 17:51:13.761822 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:51:13 crc kubenswrapper[4886]: E0129 17:51:13.762460 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlxp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qsjfd_openshift-marketplace(7ceed770-f253-4044-92f0-c8a07b89b621): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:51:13 crc kubenswrapper[4886]: E0129 17:51:13.763743 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:51:19 crc kubenswrapper[4886]: I0129 17:51:19.615416 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:51:19 crc kubenswrapper[4886]: E0129 17:51:19.616302 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:51:27 crc kubenswrapper[4886]: E0129 17:51:27.618964 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:51:32 crc kubenswrapper[4886]: I0129 17:51:32.615515 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 
17:51:32 crc kubenswrapper[4886]: E0129 17:51:32.616540 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:51:38 crc kubenswrapper[4886]: E0129 17:51:38.631432 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:51:44 crc kubenswrapper[4886]: I0129 17:51:44.616507 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:51:44 crc kubenswrapper[4886]: E0129 17:51:44.617874 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:51:51 crc kubenswrapper[4886]: E0129 17:51:51.617816 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:51:57 crc kubenswrapper[4886]: I0129 17:51:57.617037 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:51:57 crc kubenswrapper[4886]: E0129 17:51:57.618500 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:52:03 crc kubenswrapper[4886]: E0129 17:52:03.620598 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:52:10 crc kubenswrapper[4886]: I0129 17:52:10.626064 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:52:10 crc kubenswrapper[4886]: E0129 17:52:10.629793 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:52:17 crc kubenswrapper[4886]: E0129 17:52:17.619465 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:52:23 crc kubenswrapper[4886]: I0129 17:52:23.620434 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:52:23 crc kubenswrapper[4886]: E0129 17:52:23.621732 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:52:28 crc kubenswrapper[4886]: E0129 17:52:28.632050 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:52:38 crc kubenswrapper[4886]: I0129 17:52:38.627917 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:52:38 crc kubenswrapper[4886]: E0129 17:52:38.629207 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:52:41 crc kubenswrapper[4886]: E0129 17:52:41.618806 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:52:51 crc kubenswrapper[4886]: I0129 17:52:51.616259 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:52:51 crc kubenswrapper[4886]: E0129 17:52:51.617442 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:52:54 crc kubenswrapper[4886]: E0129 17:52:54.620919 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:53:02 crc kubenswrapper[4886]: I0129 17:53:02.615643 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:53:02 crc kubenswrapper[4886]: E0129 17:53:02.616645 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:53:09 crc kubenswrapper[4886]: E0129 17:53:09.619564 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:53:16 crc kubenswrapper[4886]: I0129 17:53:16.615824 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:53:16 crc kubenswrapper[4886]: E0129 17:53:16.617608 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:53:23 crc kubenswrapper[4886]: E0129 17:53:23.620075 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:53:29 crc kubenswrapper[4886]: I0129 17:53:29.615349 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:53:29 crc kubenswrapper[4886]: E0129 17:53:29.615970 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 17:53:36 crc kubenswrapper[4886]: E0129 17:53:36.619857 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:53:42 crc kubenswrapper[4886]: I0129 17:53:42.614947 4886 scope.go:117] "RemoveContainer" 
containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:53:43 crc kubenswrapper[4886]: I0129 17:53:43.814960 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"fa1f6ca4f64abfca286935b5cea47f9bd94b19d5dd8d9a7d6d366866d5a4fa94"} Jan 29 17:53:51 crc kubenswrapper[4886]: E0129 17:53:51.618850 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:54:02 crc kubenswrapper[4886]: E0129 17:54:02.619956 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:54:16 crc kubenswrapper[4886]: E0129 17:54:16.619631 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:54:29 crc kubenswrapper[4886]: I0129 17:54:29.193867 4886 trace.go:236] Trace[2044250100]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-1" (29-Jan-2026 17:54:28.184) (total time: 1009ms): Jan 29 17:54:29 crc kubenswrapper[4886]: Trace[2044250100]: [1.009713838s] [1.009713838s] END Jan 29 17:54:29 crc kubenswrapper[4886]: E0129 17:54:29.618662 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:54:40 crc kubenswrapper[4886]: E0129 17:54:40.622130 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:54:52 crc kubenswrapper[4886]: E0129 17:54:52.627982 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:55:03 crc kubenswrapper[4886]: E0129 17:55:03.619717 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:55:17 crc kubenswrapper[4886]: E0129 17:55:17.618022 
4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:55:31 crc kubenswrapper[4886]: E0129 17:55:31.618667 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:55:45 crc kubenswrapper[4886]: E0129 17:55:45.621199 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:55:58 crc kubenswrapper[4886]: E0129 17:55:58.633195 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:55:59 crc kubenswrapper[4886]: I0129 17:55:59.661434 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:55:59 crc kubenswrapper[4886]: I0129 17:55:59.661526 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:56:11 crc kubenswrapper[4886]: E0129 17:56:11.618266 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:56:24 crc kubenswrapper[4886]: I0129 17:56:24.620884 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 17:56:24 crc kubenswrapper[4886]: E0129 17:56:24.761270 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 17:56:24 crc kubenswrapper[4886]: E0129 17:56:24.761459 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlxp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qsjfd_openshift-marketplace(7ceed770-f253-4044-92f0-c8a07b89b621): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 17:56:24 crc kubenswrapper[4886]: E0129 17:56:24.762642 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:56:29 crc kubenswrapper[4886]: I0129 17:56:29.661117 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:56:29 crc kubenswrapper[4886]: I0129 17:56:29.661716 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:56:38 crc kubenswrapper[4886]: E0129 17:56:38.634555 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:56:49 crc kubenswrapper[4886]: E0129 17:56:49.617322 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:56:59 crc kubenswrapper[4886]: I0129 17:56:59.661389 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:56:59 crc kubenswrapper[4886]: I0129 17:56:59.661941 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:56:59 crc kubenswrapper[4886]: I0129 17:56:59.661991 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 17:56:59 crc kubenswrapper[4886]: I0129 17:56:59.662917 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa1f6ca4f64abfca286935b5cea47f9bd94b19d5dd8d9a7d6d366866d5a4fa94"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 17:56:59 crc kubenswrapper[4886]: I0129 17:56:59.662979 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://fa1f6ca4f64abfca286935b5cea47f9bd94b19d5dd8d9a7d6d366866d5a4fa94" gracePeriod=600 Jan 29 17:57:00 crc kubenswrapper[4886]: I0129 17:57:00.385914 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="fa1f6ca4f64abfca286935b5cea47f9bd94b19d5dd8d9a7d6d366866d5a4fa94" exitCode=0 Jan 29 17:57:00 crc kubenswrapper[4886]: I0129 17:57:00.386008 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"fa1f6ca4f64abfca286935b5cea47f9bd94b19d5dd8d9a7d6d366866d5a4fa94"} Jan 29 17:57:00 crc kubenswrapper[4886]: I0129 17:57:00.386307 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a"} Jan 29 17:57:00 crc kubenswrapper[4886]: I0129 17:57:00.386367 4886 scope.go:117] "RemoveContainer" containerID="8f37486cd564f3c9ff31aeb674510c8a56e76898f95a0396c83ca3b24bffcac3" Jan 29 17:57:01 crc kubenswrapper[4886]: E0129 17:57:01.617357 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:57:14 crc kubenswrapper[4886]: E0129 17:57:14.620668 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:57:27 crc kubenswrapper[4886]: E0129 17:57:27.618681 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:57:41 crc kubenswrapper[4886]: E0129 17:57:41.618087 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:57:55 crc kubenswrapper[4886]: E0129 17:57:55.617752 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:58:10 crc kubenswrapper[4886]: E0129 17:58:10.617110 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:58:24 crc kubenswrapper[4886]: E0129 17:58:24.617720 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:58:38 crc kubenswrapper[4886]: E0129 17:58:38.638611 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:58:52 crc kubenswrapper[4886]: E0129 17:58:52.618895 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:59:05 crc kubenswrapper[4886]: E0129 17:59:05.617535 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:59:17 crc kubenswrapper[4886]: E0129 17:59:17.619974 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:59:28 crc kubenswrapper[4886]: E0129 17:59:28.635258 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:59:29 crc kubenswrapper[4886]: I0129 17:59:29.661131 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:59:29 crc kubenswrapper[4886]: I0129 17:59:29.661592 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 17:59:40 crc kubenswrapper[4886]: E0129 17:59:40.623968 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:59:51 crc kubenswrapper[4886]: E0129 17:59:51.618289 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 17:59:59 crc kubenswrapper[4886]: I0129 17:59:59.660631 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 17:59:59 crc kubenswrapper[4886]: I0129 17:59:59.661380 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.180701 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk"] Jan 29 18:00:00 crc kubenswrapper[4886]: E0129 18:00:00.181785 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a03da1-7fa0-41f6-b906-4769ab664bc5" containerName="extract-content" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.181879 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a03da1-7fa0-41f6-b906-4769ab664bc5" containerName="extract-content" Jan 29 18:00:00 crc kubenswrapper[4886]: E0129 
18:00:00.181967 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a03da1-7fa0-41f6-b906-4769ab664bc5" containerName="registry-server" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.182035 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a03da1-7fa0-41f6-b906-4769ab664bc5" containerName="registry-server" Jan 29 18:00:00 crc kubenswrapper[4886]: E0129 18:00:00.182124 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a03da1-7fa0-41f6-b906-4769ab664bc5" containerName="extract-utilities" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.182197 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a03da1-7fa0-41f6-b906-4769ab664bc5" containerName="extract-utilities" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.182626 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a03da1-7fa0-41f6-b906-4769ab664bc5" containerName="registry-server" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.183735 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.193142 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk"] Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.216756 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.216765 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.221186 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d5d488e-61ed-4dc1-b209-0d4c90eac204-config-volume\") pod \"collect-profiles-29495160-89jtk\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.221392 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d5d488e-61ed-4dc1-b209-0d4c90eac204-secret-volume\") pod \"collect-profiles-29495160-89jtk\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.221467 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv8rn\" (UniqueName: \"kubernetes.io/projected/7d5d488e-61ed-4dc1-b209-0d4c90eac204-kube-api-access-dv8rn\") pod \"collect-profiles-29495160-89jtk\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.323719 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d5d488e-61ed-4dc1-b209-0d4c90eac204-secret-volume\") pod \"collect-profiles-29495160-89jtk\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:00 crc kubenswrapper[4886]: 
I0129 18:00:00.323813 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv8rn\" (UniqueName: \"kubernetes.io/projected/7d5d488e-61ed-4dc1-b209-0d4c90eac204-kube-api-access-dv8rn\") pod \"collect-profiles-29495160-89jtk\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.323932 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d5d488e-61ed-4dc1-b209-0d4c90eac204-config-volume\") pod \"collect-profiles-29495160-89jtk\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.325450 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d5d488e-61ed-4dc1-b209-0d4c90eac204-config-volume\") pod \"collect-profiles-29495160-89jtk\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.344202 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d5d488e-61ed-4dc1-b209-0d4c90eac204-secret-volume\") pod \"collect-profiles-29495160-89jtk\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.344753 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv8rn\" (UniqueName: \"kubernetes.io/projected/7d5d488e-61ed-4dc1-b209-0d4c90eac204-kube-api-access-dv8rn\") pod \"collect-profiles-29495160-89jtk\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:00 crc kubenswrapper[4886]: I0129 18:00:00.537788 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:01 crc kubenswrapper[4886]: I0129 18:00:01.090259 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk"] Jan 29 18:00:01 crc kubenswrapper[4886]: W0129 18:00:01.098820 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d5d488e_61ed_4dc1_b209_0d4c90eac204.slice/crio-a95a2ce75d3c522077c1b1011fca397ce7bf0005a75c52b7823917a27fd7f83d WatchSource:0}: Error finding container a95a2ce75d3c522077c1b1011fca397ce7bf0005a75c52b7823917a27fd7f83d: Status 404 returned error can't find the container with id a95a2ce75d3c522077c1b1011fca397ce7bf0005a75c52b7823917a27fd7f83d Jan 29 18:00:01 crc kubenswrapper[4886]: I0129 18:00:01.607849 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" event={"ID":"7d5d488e-61ed-4dc1-b209-0d4c90eac204","Type":"ContainerStarted","Data":"a3a5dd3d496c3ee55fe9582f3f1eae23dc0082c3dc8857b9c954a351ccc720bb"} Jan 29 18:00:01 crc kubenswrapper[4886]: I0129 18:00:01.608204 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" event={"ID":"7d5d488e-61ed-4dc1-b209-0d4c90eac204","Type":"ContainerStarted","Data":"a95a2ce75d3c522077c1b1011fca397ce7bf0005a75c52b7823917a27fd7f83d"} Jan 29 18:00:02 crc kubenswrapper[4886]: E0129 18:00:02.616636 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 18:00:02 crc kubenswrapper[4886]: I0129 18:00:02.664549 4886 generic.go:334] "Generic (PLEG): container finished" podID="7d5d488e-61ed-4dc1-b209-0d4c90eac204" containerID="a3a5dd3d496c3ee55fe9582f3f1eae23dc0082c3dc8857b9c954a351ccc720bb" exitCode=0 Jan 29 18:00:02 crc kubenswrapper[4886]: I0129 18:00:02.664625 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" event={"ID":"7d5d488e-61ed-4dc1-b209-0d4c90eac204","Type":"ContainerDied","Data":"a3a5dd3d496c3ee55fe9582f3f1eae23dc0082c3dc8857b9c954a351ccc720bb"} Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.133653 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.225108 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d5d488e-61ed-4dc1-b209-0d4c90eac204-secret-volume\") pod \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.225284 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv8rn\" (UniqueName: \"kubernetes.io/projected/7d5d488e-61ed-4dc1-b209-0d4c90eac204-kube-api-access-dv8rn\") pod \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.225574 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d5d488e-61ed-4dc1-b209-0d4c90eac204-config-volume\") pod \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\" (UID: \"7d5d488e-61ed-4dc1-b209-0d4c90eac204\") " Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.226817 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d5d488e-61ed-4dc1-b209-0d4c90eac204-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d5d488e-61ed-4dc1-b209-0d4c90eac204" (UID: "7d5d488e-61ed-4dc1-b209-0d4c90eac204"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.253662 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d5d488e-61ed-4dc1-b209-0d4c90eac204-kube-api-access-dv8rn" (OuterVolumeSpecName: "kube-api-access-dv8rn") pod "7d5d488e-61ed-4dc1-b209-0d4c90eac204" (UID: "7d5d488e-61ed-4dc1-b209-0d4c90eac204"). InnerVolumeSpecName "kube-api-access-dv8rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.287056 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5d488e-61ed-4dc1-b209-0d4c90eac204-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d5d488e-61ed-4dc1-b209-0d4c90eac204" (UID: "7d5d488e-61ed-4dc1-b209-0d4c90eac204"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.329202 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d5d488e-61ed-4dc1-b209-0d4c90eac204-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.329229 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d5d488e-61ed-4dc1-b209-0d4c90eac204-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.329239 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv8rn\" (UniqueName: \"kubernetes.io/projected/7d5d488e-61ed-4dc1-b209-0d4c90eac204-kube-api-access-dv8rn\") on node \"crc\" DevicePath \"\"" Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.692306 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" event={"ID":"7d5d488e-61ed-4dc1-b209-0d4c90eac204","Type":"ContainerDied","Data":"a95a2ce75d3c522077c1b1011fca397ce7bf0005a75c52b7823917a27fd7f83d"} Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.692387 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a95a2ce75d3c522077c1b1011fca397ce7bf0005a75c52b7823917a27fd7f83d" Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.692471 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495160-89jtk" Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.727309 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz"] Jan 29 18:00:04 crc kubenswrapper[4886]: I0129 18:00:04.739367 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495115-pkxcz"] Jan 29 18:00:06 crc kubenswrapper[4886]: I0129 18:00:06.634621 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="875b9b50-c440-4567-b475-c890d3d5d713" path="/var/lib/kubelet/pods/875b9b50-c440-4567-b475-c890d3d5d713/volumes" Jan 29 18:00:17 crc kubenswrapper[4886]: E0129 18:00:17.618868 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.021681 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-58r66"] Jan 29 18:00:22 crc kubenswrapper[4886]: E0129 18:00:22.022853 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5d488e-61ed-4dc1-b209-0d4c90eac204" containerName="collect-profiles" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.022869 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5d488e-61ed-4dc1-b209-0d4c90eac204" containerName="collect-profiles" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.023150 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5d488e-61ed-4dc1-b209-0d4c90eac204" containerName="collect-profiles" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.025201 4886 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.040886 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-58r66"] Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.120676 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pksbf\" (UniqueName: \"kubernetes.io/projected/26d3617b-467f-42e7-b171-2652f60e856a-kube-api-access-pksbf\") pod \"community-operators-58r66\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.120737 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-utilities\") pod \"community-operators-58r66\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.120913 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-catalog-content\") pod \"community-operators-58r66\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.222618 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-utilities\") pod \"community-operators-58r66\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.222786 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-catalog-content\") pod \"community-operators-58r66\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.222994 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pksbf\" (UniqueName: \"kubernetes.io/projected/26d3617b-467f-42e7-b171-2652f60e856a-kube-api-access-pksbf\") pod \"community-operators-58r66\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.223834 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-utilities\") pod \"community-operators-58r66\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.224135 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-catalog-content\") pod \"community-operators-58r66\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.243486 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pksbf\" (UniqueName: \"kubernetes.io/projected/26d3617b-467f-42e7-b171-2652f60e856a-kube-api-access-pksbf\") pod \"community-operators-58r66\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:22 crc kubenswrapper[4886]: I0129 18:00:22.369846 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:23 crc kubenswrapper[4886]: I0129 18:00:23.694060 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-58r66"] Jan 29 18:00:23 crc kubenswrapper[4886]: I0129 18:00:23.924876 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58r66" event={"ID":"26d3617b-467f-42e7-b171-2652f60e856a","Type":"ContainerStarted","Data":"35d7a80f0d4f24685099c1759ec7b05ca9d597f3a2a3871214d4945e075e4c55"} Jan 29 18:00:24 crc kubenswrapper[4886]: I0129 18:00:24.941472 4886 generic.go:334] "Generic (PLEG): container finished" podID="26d3617b-467f-42e7-b171-2652f60e856a" containerID="b620a665001976e28d6625a514bd7e44772c65a9d80ded020ecca7162863f51b" exitCode=0 Jan 29 18:00:24 crc kubenswrapper[4886]: I0129 18:00:24.941608 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58r66" event={"ID":"26d3617b-467f-42e7-b171-2652f60e856a","Type":"ContainerDied","Data":"b620a665001976e28d6625a514bd7e44772c65a9d80ded020ecca7162863f51b"} Jan 29 18:00:26 crc kubenswrapper[4886]: I0129 18:00:26.971595 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58r66" event={"ID":"26d3617b-467f-42e7-b171-2652f60e856a","Type":"ContainerStarted","Data":"ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148"} Jan 29 18:00:27 crc kubenswrapper[4886]: I0129 18:00:27.983009 4886 generic.go:334] "Generic (PLEG): container finished" podID="26d3617b-467f-42e7-b171-2652f60e856a" containerID="ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148" exitCode=0 Jan 29 18:00:27 crc kubenswrapper[4886]: I0129 18:00:27.983067 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58r66" event={"ID":"26d3617b-467f-42e7-b171-2652f60e856a","Type":"ContainerDied","Data":"ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148"} Jan 29 18:00:28 crc kubenswrapper[4886]: I0129 18:00:28.997549 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58r66" event={"ID":"26d3617b-467f-42e7-b171-2652f60e856a","Type":"ContainerStarted","Data":"e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f"} Jan 29 18:00:29 crc kubenswrapper[4886]: I0129 18:00:29.041622 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-58r66" podStartSLOduration=4.381319053 podStartE2EDuration="8.041600708s" podCreationTimestamp="2026-01-29 18:00:21 +0000 UTC" firstStartedPulling="2026-01-29 18:00:24.943826851 +0000 UTC m=+5907.852546133" lastFinishedPulling="2026-01-29 18:00:28.604108486 +0000 UTC m=+5911.512827788" observedRunningTime="2026-01-29 18:00:29.016393686 +0000 UTC m=+5911.925113028" watchObservedRunningTime="2026-01-29 18:00:29.041600708 +0000 UTC m=+5911.950319990" Jan 29 18:00:29 crc kubenswrapper[4886]: I0129 18:00:29.673370 4886 patch_prober.go:28] 
interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 18:00:29 crc kubenswrapper[4886]: I0129 18:00:29.673467 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 18:00:29 crc kubenswrapper[4886]: I0129 18:00:29.673541 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 18:00:29 crc kubenswrapper[4886]: I0129 18:00:29.685414 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 18:00:29 crc kubenswrapper[4886]: I0129 18:00:29.685542 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" gracePeriod=600 Jan 29 18:00:29 crc kubenswrapper[4886]: E0129 18:00:29.818636 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:00:30 crc kubenswrapper[4886]: I0129 18:00:30.011704 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" exitCode=0 Jan 29 18:00:30 crc kubenswrapper[4886]: I0129 18:00:30.011805 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a"} Jan 29 18:00:30 crc kubenswrapper[4886]: I0129 18:00:30.011889 4886 scope.go:117] "RemoveContainer" containerID="fa1f6ca4f64abfca286935b5cea47f9bd94b19d5dd8d9a7d6d366866d5a4fa94" Jan 29 18:00:30 crc kubenswrapper[4886]: I0129 18:00:30.013122 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:00:30 crc kubenswrapper[4886]: E0129 18:00:30.013828 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:00:31 crc kubenswrapper[4886]: E0129 18:00:31.617772 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 18:00:32 crc kubenswrapper[4886]: I0129 18:00:32.370561 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:32 crc kubenswrapper[4886]: I0129 18:00:32.370670 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:33 crc kubenswrapper[4886]: I0129 18:00:33.614830 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-58r66" podUID="26d3617b-467f-42e7-b171-2652f60e856a" containerName="registry-server" probeResult="failure" output=< Jan 29 18:00:33 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 18:00:33 crc kubenswrapper[4886]: > Jan 29 18:00:41 crc kubenswrapper[4886]: I0129 18:00:41.616001 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:00:41 crc kubenswrapper[4886]: E0129 18:00:41.617068 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:00:42 crc kubenswrapper[4886]: I0129 18:00:42.426830 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:42 crc kubenswrapper[4886]: I0129 18:00:42.477364 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:42 crc kubenswrapper[4886]: I0129 18:00:42.686763 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-58r66"] Jan 29 18:00:43 crc kubenswrapper[4886]: E0129 18:00:43.618929 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 18:00:44 crc kubenswrapper[4886]: I0129 18:00:44.188442 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-58r66" podUID="26d3617b-467f-42e7-b171-2652f60e856a" containerName="registry-server" containerID="cri-o://e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f" gracePeriod=2 Jan 29 18:00:44 crc kubenswrapper[4886]: I0129 18:00:44.787231 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:44 crc kubenswrapper[4886]: I0129 18:00:44.909345 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-catalog-content\") pod \"26d3617b-467f-42e7-b171-2652f60e856a\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " Jan 29 18:00:44 crc kubenswrapper[4886]: I0129 18:00:44.909710 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-utilities\") pod \"26d3617b-467f-42e7-b171-2652f60e856a\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " Jan 29 18:00:44 crc kubenswrapper[4886]: I0129 18:00:44.909864 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pksbf\" (UniqueName: \"kubernetes.io/projected/26d3617b-467f-42e7-b171-2652f60e856a-kube-api-access-pksbf\") pod \"26d3617b-467f-42e7-b171-2652f60e856a\" (UID: \"26d3617b-467f-42e7-b171-2652f60e856a\") " Jan 29 18:00:44 crc kubenswrapper[4886]: I0129 18:00:44.910440 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-utilities" (OuterVolumeSpecName: "utilities") pod "26d3617b-467f-42e7-b171-2652f60e856a" (UID: "26d3617b-467f-42e7-b171-2652f60e856a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:00:44 crc kubenswrapper[4886]: I0129 18:00:44.910647 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 18:00:44 crc kubenswrapper[4886]: I0129 18:00:44.922757 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d3617b-467f-42e7-b171-2652f60e856a-kube-api-access-pksbf" (OuterVolumeSpecName: "kube-api-access-pksbf") pod "26d3617b-467f-42e7-b171-2652f60e856a" (UID: "26d3617b-467f-42e7-b171-2652f60e856a"). InnerVolumeSpecName "kube-api-access-pksbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:00:44 crc kubenswrapper[4886]: I0129 18:00:44.987134 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26d3617b-467f-42e7-b171-2652f60e856a" (UID: "26d3617b-467f-42e7-b171-2652f60e856a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.012775 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pksbf\" (UniqueName: \"kubernetes.io/projected/26d3617b-467f-42e7-b171-2652f60e856a-kube-api-access-pksbf\") on node \"crc\" DevicePath \"\"" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.012817 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d3617b-467f-42e7-b171-2652f60e856a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.207092 4886 generic.go:334] "Generic (PLEG): container finished" podID="26d3617b-467f-42e7-b171-2652f60e856a" containerID="e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f" exitCode=0 Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.207178 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58r66" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.207200 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58r66" event={"ID":"26d3617b-467f-42e7-b171-2652f60e856a","Type":"ContainerDied","Data":"e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f"} Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.209348 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58r66" event={"ID":"26d3617b-467f-42e7-b171-2652f60e856a","Type":"ContainerDied","Data":"35d7a80f0d4f24685099c1759ec7b05ca9d597f3a2a3871214d4945e075e4c55"} Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.209384 4886 scope.go:117] "RemoveContainer" containerID="e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.235960 4886 scope.go:117] "RemoveContainer" containerID="ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.275030 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-58r66"] Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.284533 4886 scope.go:117] "RemoveContainer" containerID="b620a665001976e28d6625a514bd7e44772c65a9d80ded020ecca7162863f51b" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.288139 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-58r66"] Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.330932 4886 scope.go:117] "RemoveContainer" containerID="e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f" Jan 29 18:00:45 crc kubenswrapper[4886]: E0129 18:00:45.334201 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f\": container with ID starting with e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f not found: ID does not exist" containerID="e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.334241 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f"} err="failed to get container status 
\"e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f\": rpc error: code = NotFound desc = could not find container \"e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f\": container with ID starting with e414a896b88f092e6432856ecb0c1b6b443cf48e31bdaac4980adf1ae5105d4f not found: ID does not exist" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.334273 4886 scope.go:117] "RemoveContainer" containerID="ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148" Jan 29 18:00:45 crc kubenswrapper[4886]: E0129 18:00:45.334926 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148\": container with ID starting with ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148 not found: ID does not exist" containerID="ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.334960 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148"} err="failed to get container status \"ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148\": rpc error: code = NotFound desc = could not find container \"ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148\": container with ID starting with ee399e1894cc907afbfb2f0a808f1ddd9a838c29f41e68661126447443043148 not found: ID does not exist" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.334995 4886 scope.go:117] "RemoveContainer" containerID="b620a665001976e28d6625a514bd7e44772c65a9d80ded020ecca7162863f51b" Jan 29 18:00:45 crc kubenswrapper[4886]: E0129 18:00:45.336293 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b620a665001976e28d6625a514bd7e44772c65a9d80ded020ecca7162863f51b\": container with ID starting with b620a665001976e28d6625a514bd7e44772c65a9d80ded020ecca7162863f51b not found: ID does not exist" containerID="b620a665001976e28d6625a514bd7e44772c65a9d80ded020ecca7162863f51b" Jan 29 18:00:45 crc kubenswrapper[4886]: I0129 18:00:45.336343 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b620a665001976e28d6625a514bd7e44772c65a9d80ded020ecca7162863f51b"} err="failed to get container status \"b620a665001976e28d6625a514bd7e44772c65a9d80ded020ecca7162863f51b\": rpc error: code = NotFound desc = could not find container \"b620a665001976e28d6625a514bd7e44772c65a9d80ded020ecca7162863f51b\": container with ID starting with b620a665001976e28d6625a514bd7e44772c65a9d80ded020ecca7162863f51b not found: ID does not exist" Jan 29 18:00:46 crc kubenswrapper[4886]: I0129 18:00:46.639782 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d3617b-467f-42e7-b171-2652f60e856a" path="/var/lib/kubelet/pods/26d3617b-467f-42e7-b171-2652f60e856a/volumes" Jan 29 18:00:54 crc kubenswrapper[4886]: E0129 18:00:54.620000 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 18:00:56 crc kubenswrapper[4886]: I0129 18:00:56.484509 4886 scope.go:117] "RemoveContainer" 
containerID="db3e3f16f0932c632a2ab1ffff0f92252979a66c9e52244934f9d97bdd89246b" Jan 29 18:00:56 crc kubenswrapper[4886]: I0129 18:00:56.615517 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:00:56 crc kubenswrapper[4886]: E0129 18:00:56.616282 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.179076 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29495161-tqptf"] Jan 29 18:01:00 crc kubenswrapper[4886]: E0129 18:01:00.180846 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d3617b-467f-42e7-b171-2652f60e856a" containerName="extract-content" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.180881 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d3617b-467f-42e7-b171-2652f60e856a" containerName="extract-content" Jan 29 18:01:00 crc kubenswrapper[4886]: E0129 18:01:00.180953 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d3617b-467f-42e7-b171-2652f60e856a" containerName="extract-utilities" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.180968 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d3617b-467f-42e7-b171-2652f60e856a" containerName="extract-utilities" Jan 29 18:01:00 crc kubenswrapper[4886]: E0129 18:01:00.181034 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d3617b-467f-42e7-b171-2652f60e856a" containerName="registry-server" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.181051 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d3617b-467f-42e7-b171-2652f60e856a" containerName="registry-server" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.181513 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d3617b-467f-42e7-b171-2652f60e856a" containerName="registry-server" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.183428 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.192014 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29495161-tqptf"] Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.365116 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-config-data\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.365170 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l694k\" (UniqueName: \"kubernetes.io/projected/62fe5584-12c8-4933-868d-bbb9e04f7bb3-kube-api-access-l694k\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.365307 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-fernet-keys\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.365421 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-combined-ca-bundle\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.468101 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-fernet-keys\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.468249 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-combined-ca-bundle\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.468301 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-config-data\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.468362 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l694k\" (UniqueName: \"kubernetes.io/projected/62fe5584-12c8-4933-868d-bbb9e04f7bb3-kube-api-access-l694k\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.477666 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-config-data\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.477959 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-combined-ca-bundle\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.478191 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-fernet-keys\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.499397 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l694k\" (UniqueName: \"kubernetes.io/projected/62fe5584-12c8-4933-868d-bbb9e04f7bb3-kube-api-access-l694k\") pod \"keystone-cron-29495161-tqptf\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:00 crc kubenswrapper[4886]: I0129 18:01:00.509318 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:01 crc kubenswrapper[4886]: I0129 18:01:01.053187 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29495161-tqptf"] Jan 29 18:01:01 crc kubenswrapper[4886]: W0129 18:01:01.053966 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62fe5584_12c8_4933_868d_bbb9e04f7bb3.slice/crio-b41604ad0f6d67f9c35086ac556ef6beebd1f2ec8853782909cd8f19ef4fd03a WatchSource:0}: Error finding container b41604ad0f6d67f9c35086ac556ef6beebd1f2ec8853782909cd8f19ef4fd03a: Status 404 returned error can't find the container with id b41604ad0f6d67f9c35086ac556ef6beebd1f2ec8853782909cd8f19ef4fd03a Jan 29 18:01:01 crc kubenswrapper[4886]: I0129 18:01:01.409600 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495161-tqptf" event={"ID":"62fe5584-12c8-4933-868d-bbb9e04f7bb3","Type":"ContainerStarted","Data":"f20811ea62519d50e5ec92d004c06f490a5ae492a283aa90b258514105a668e0"} Jan 29 18:01:01 crc kubenswrapper[4886]: I0129 18:01:01.409999 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495161-tqptf" event={"ID":"62fe5584-12c8-4933-868d-bbb9e04f7bb3","Type":"ContainerStarted","Data":"b41604ad0f6d67f9c35086ac556ef6beebd1f2ec8853782909cd8f19ef4fd03a"} Jan 29 18:01:01 crc kubenswrapper[4886]: I0129 18:01:01.435596 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29495161-tqptf" podStartSLOduration=1.435575421 podStartE2EDuration="1.435575421s" podCreationTimestamp="2026-01-29 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 18:01:01.428058629 +0000 UTC m=+5944.336777901" watchObservedRunningTime="2026-01-29 18:01:01.435575421 +0000 UTC m=+5944.344294693" Jan 29 18:01:05 crc kubenswrapper[4886]: I0129 18:01:05.454761 4886 
generic.go:334] "Generic (PLEG): container finished" podID="62fe5584-12c8-4933-868d-bbb9e04f7bb3" containerID="f20811ea62519d50e5ec92d004c06f490a5ae492a283aa90b258514105a668e0" exitCode=0 Jan 29 18:01:05 crc kubenswrapper[4886]: I0129 18:01:05.454823 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495161-tqptf" event={"ID":"62fe5584-12c8-4933-868d-bbb9e04f7bb3","Type":"ContainerDied","Data":"f20811ea62519d50e5ec92d004c06f490a5ae492a283aa90b258514105a668e0"} Jan 29 18:01:06 crc kubenswrapper[4886]: I0129 18:01:06.955303 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.099492 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-combined-ca-bundle\") pod \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.099636 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-config-data\") pod \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.099686 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-fernet-keys\") pod \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.099804 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l694k\" (UniqueName: \"kubernetes.io/projected/62fe5584-12c8-4933-868d-bbb9e04f7bb3-kube-api-access-l694k\") pod \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\" (UID: \"62fe5584-12c8-4933-868d-bbb9e04f7bb3\") " Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.106634 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "62fe5584-12c8-4933-868d-bbb9e04f7bb3" (UID: "62fe5584-12c8-4933-868d-bbb9e04f7bb3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.106704 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62fe5584-12c8-4933-868d-bbb9e04f7bb3-kube-api-access-l694k" (OuterVolumeSpecName: "kube-api-access-l694k") pod "62fe5584-12c8-4933-868d-bbb9e04f7bb3" (UID: "62fe5584-12c8-4933-868d-bbb9e04f7bb3"). InnerVolumeSpecName "kube-api-access-l694k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.137883 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62fe5584-12c8-4933-868d-bbb9e04f7bb3" (UID: "62fe5584-12c8-4933-868d-bbb9e04f7bb3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.181492 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-config-data" (OuterVolumeSpecName: "config-data") pod "62fe5584-12c8-4933-868d-bbb9e04f7bb3" (UID: "62fe5584-12c8-4933-868d-bbb9e04f7bb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.203260 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.203295 4886 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.203308 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l694k\" (UniqueName: \"kubernetes.io/projected/62fe5584-12c8-4933-868d-bbb9e04f7bb3-kube-api-access-l694k\") on node \"crc\" DevicePath \"\"" Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.203336 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fe5584-12c8-4933-868d-bbb9e04f7bb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.483101 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495161-tqptf" event={"ID":"62fe5584-12c8-4933-868d-bbb9e04f7bb3","Type":"ContainerDied","Data":"b41604ad0f6d67f9c35086ac556ef6beebd1f2ec8853782909cd8f19ef4fd03a"} Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.483154 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b41604ad0f6d67f9c35086ac556ef6beebd1f2ec8853782909cd8f19ef4fd03a" Jan 29 18:01:07 crc kubenswrapper[4886]: I0129 18:01:07.483169 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29495161-tqptf" Jan 29 18:01:08 crc kubenswrapper[4886]: I0129 18:01:08.623674 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:01:08 crc kubenswrapper[4886]: E0129 18:01:08.624464 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:01:09 crc kubenswrapper[4886]: E0129 18:01:09.619689 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 18:01:20 crc kubenswrapper[4886]: I0129 18:01:20.615095 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:01:20 crc kubenswrapper[4886]: E0129 18:01:20.616057 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:01:22 crc kubenswrapper[4886]: E0129 18:01:22.620006 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" Jan 29 18:01:31 crc kubenswrapper[4886]: I0129 18:01:31.615899 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:01:31 crc kubenswrapper[4886]: E0129 18:01:31.616991 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:01:34 crc kubenswrapper[4886]: I0129 18:01:34.626503 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 18:01:35 crc kubenswrapper[4886]: I0129 18:01:35.821283 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsjfd" event={"ID":"7ceed770-f253-4044-92f0-c8a07b89b621","Type":"ContainerStarted","Data":"a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86"} Jan 29 18:01:37 crc kubenswrapper[4886]: I0129 18:01:37.848967 4886 generic.go:334] "Generic (PLEG): container finished" podID="7ceed770-f253-4044-92f0-c8a07b89b621" 
containerID="a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86" exitCode=0 Jan 29 18:01:37 crc kubenswrapper[4886]: I0129 18:01:37.849080 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsjfd" event={"ID":"7ceed770-f253-4044-92f0-c8a07b89b621","Type":"ContainerDied","Data":"a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86"} Jan 29 18:01:38 crc kubenswrapper[4886]: I0129 18:01:38.861468 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsjfd" event={"ID":"7ceed770-f253-4044-92f0-c8a07b89b621","Type":"ContainerStarted","Data":"57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a"} Jan 29 18:01:42 crc kubenswrapper[4886]: I0129 18:01:42.615243 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:01:42 crc kubenswrapper[4886]: E0129 18:01:42.616064 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:01:44 crc kubenswrapper[4886]: I0129 18:01:44.839218 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 18:01:44 crc kubenswrapper[4886]: I0129 18:01:44.839589 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 18:01:44 crc kubenswrapper[4886]: I0129 18:01:44.903078 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 18:01:44 crc kubenswrapper[4886]: I0129 18:01:44.929498 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qsjfd" podStartSLOduration=8.561621441 podStartE2EDuration="21m30.929477222s" podCreationTimestamp="2026-01-29 17:40:14 +0000 UTC" firstStartedPulling="2026-01-29 17:40:15.942749248 +0000 UTC m=+4698.851468530" lastFinishedPulling="2026-01-29 18:01:38.310604999 +0000 UTC m=+5981.219324311" observedRunningTime="2026-01-29 18:01:38.892139234 +0000 UTC m=+5981.800858526" watchObservedRunningTime="2026-01-29 18:01:44.929477222 +0000 UTC m=+5987.838196504" Jan 29 18:01:45 crc kubenswrapper[4886]: I0129 18:01:45.009454 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 18:01:45 crc kubenswrapper[4886]: I0129 18:01:45.144497 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qsjfd"] Jan 29 18:01:46 crc kubenswrapper[4886]: I0129 18:01:46.951275 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qsjfd" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" containerName="registry-server" containerID="cri-o://57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a" gracePeriod=2 Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.435042 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.481414 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-catalog-content\") pod \"7ceed770-f253-4044-92f0-c8a07b89b621\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.482524 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-utilities\") pod \"7ceed770-f253-4044-92f0-c8a07b89b621\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.482658 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlxp8\" (UniqueName: \"kubernetes.io/projected/7ceed770-f253-4044-92f0-c8a07b89b621-kube-api-access-nlxp8\") pod \"7ceed770-f253-4044-92f0-c8a07b89b621\" (UID: \"7ceed770-f253-4044-92f0-c8a07b89b621\") " Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.483553 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-utilities" (OuterVolumeSpecName: "utilities") pod "7ceed770-f253-4044-92f0-c8a07b89b621" (UID: "7ceed770-f253-4044-92f0-c8a07b89b621"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.483917 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.488177 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ceed770-f253-4044-92f0-c8a07b89b621-kube-api-access-nlxp8" (OuterVolumeSpecName: "kube-api-access-nlxp8") pod "7ceed770-f253-4044-92f0-c8a07b89b621" (UID: "7ceed770-f253-4044-92f0-c8a07b89b621"). InnerVolumeSpecName "kube-api-access-nlxp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.539738 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ceed770-f253-4044-92f0-c8a07b89b621" (UID: "7ceed770-f253-4044-92f0-c8a07b89b621"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.586265 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlxp8\" (UniqueName: \"kubernetes.io/projected/7ceed770-f253-4044-92f0-c8a07b89b621-kube-api-access-nlxp8\") on node \"crc\" DevicePath \"\"" Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.586304 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ceed770-f253-4044-92f0-c8a07b89b621-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.964126 4886 generic.go:334] "Generic (PLEG): container finished" podID="7ceed770-f253-4044-92f0-c8a07b89b621" containerID="57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a" exitCode=0 Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.964162 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsjfd" event={"ID":"7ceed770-f253-4044-92f0-c8a07b89b621","Type":"ContainerDied","Data":"57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a"} Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.964187 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsjfd" event={"ID":"7ceed770-f253-4044-92f0-c8a07b89b621","Type":"ContainerDied","Data":"fb5b6b721dd0a2050f48ef0e26fac1871e4ba7b7b47b95e41a00c0852ef2c55b"} Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.964204 4886 scope.go:117] "RemoveContainer" containerID="57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a" Jan 29 18:01:47 crc kubenswrapper[4886]: I0129 18:01:47.964388 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qsjfd" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.001676 4886 scope.go:117] "RemoveContainer" containerID="a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.019394 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qsjfd"] Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.032436 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qsjfd"] Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.049462 4886 scope.go:117] "RemoveContainer" containerID="bedb65e37127565b5119ee8d90f572bdf6b6802d26fcd6797bad10fc8e07c14b" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.099055 4886 scope.go:117] "RemoveContainer" containerID="57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a" Jan 29 18:01:48 crc kubenswrapper[4886]: E0129 18:01:48.101222 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a\": container with ID starting with 57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a not found: ID does not exist" containerID="57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.101286 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a"} err="failed to get container status \"57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a\": rpc error: code = NotFound desc = could not find container \"57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a\": container with ID starting with 57611bb9d4c88485f704785c6260beffdf3364717c2a0a0bf33dbfb1aa8bb69a not found: ID does not exist" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.101316 4886 scope.go:117] "RemoveContainer" containerID="a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86" Jan 29 18:01:48 crc kubenswrapper[4886]: E0129 18:01:48.101870 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86\": container with ID starting with a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86 not found: ID does not exist" containerID="a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.101908 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86"} err="failed to get container status \"a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86\": rpc error: code = NotFound desc = could not find container \"a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86\": container with ID starting with a26e038fba7b20c6bbd8f67983806ee67b86edda4f42bed3b1e5dc6e19691d86 not found: ID does not exist" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.101935 4886 scope.go:117] "RemoveContainer" containerID="bedb65e37127565b5119ee8d90f572bdf6b6802d26fcd6797bad10fc8e07c14b" Jan 29 18:01:48 crc kubenswrapper[4886]: E0129 18:01:48.103769 4886 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bedb65e37127565b5119ee8d90f572bdf6b6802d26fcd6797bad10fc8e07c14b\": container with ID starting with bedb65e37127565b5119ee8d90f572bdf6b6802d26fcd6797bad10fc8e07c14b not found: ID does not exist" containerID="bedb65e37127565b5119ee8d90f572bdf6b6802d26fcd6797bad10fc8e07c14b" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.103816 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bedb65e37127565b5119ee8d90f572bdf6b6802d26fcd6797bad10fc8e07c14b"} err="failed to get container status \"bedb65e37127565b5119ee8d90f572bdf6b6802d26fcd6797bad10fc8e07c14b\": rpc error: code = NotFound desc = could not find container \"bedb65e37127565b5119ee8d90f572bdf6b6802d26fcd6797bad10fc8e07c14b\": container with ID starting with bedb65e37127565b5119ee8d90f572bdf6b6802d26fcd6797bad10fc8e07c14b not found: ID does not exist" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.571466 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q5rzj"] Jan 29 18:01:48 crc kubenswrapper[4886]: E0129 18:01:48.572001 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" containerName="extract-utilities" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.572021 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" containerName="extract-utilities" Jan 29 18:01:48 crc kubenswrapper[4886]: E0129 18:01:48.572047 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" containerName="extract-content" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.572056 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" containerName="extract-content" Jan 29 18:01:48 crc kubenswrapper[4886]: E0129 18:01:48.572070 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" containerName="registry-server" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.572079 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" containerName="registry-server" Jan 29 18:01:48 crc kubenswrapper[4886]: E0129 18:01:48.572129 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62fe5584-12c8-4933-868d-bbb9e04f7bb3" containerName="keystone-cron" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.572137 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fe5584-12c8-4933-868d-bbb9e04f7bb3" containerName="keystone-cron" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.572414 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="62fe5584-12c8-4933-868d-bbb9e04f7bb3" containerName="keystone-cron" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.572468 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" containerName="registry-server" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.574600 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.610885 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-catalog-content\") pod \"certified-operators-q5rzj\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.611137 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mp79\" (UniqueName: \"kubernetes.io/projected/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-kube-api-access-5mp79\") pod \"certified-operators-q5rzj\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.611186 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-utilities\") pod \"certified-operators-q5rzj\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.626168 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ceed770-f253-4044-92f0-c8a07b89b621" path="/var/lib/kubelet/pods/7ceed770-f253-4044-92f0-c8a07b89b621/volumes" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.626885 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q5rzj"] Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.713240 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-catalog-content\") pod \"certified-operators-q5rzj\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.713534 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mp79\" (UniqueName: \"kubernetes.io/projected/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-kube-api-access-5mp79\") pod \"certified-operators-q5rzj\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.713570 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-utilities\") pod \"certified-operators-q5rzj\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.714111 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-catalog-content\") pod \"certified-operators-q5rzj\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.714343 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-utilities\") pod \"certified-operators-q5rzj\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.741218 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mp79\" (UniqueName: \"kubernetes.io/projected/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-kube-api-access-5mp79\") pod \"certified-operators-q5rzj\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:48 crc kubenswrapper[4886]: I0129 18:01:48.898099 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:49 crc kubenswrapper[4886]: I0129 18:01:49.557539 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q5rzj"] Jan 29 18:01:49 crc kubenswrapper[4886]: I0129 18:01:49.990730 4886 generic.go:334] "Generic (PLEG): container finished" podID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" containerID="72d80eebb60ae8c08eef0770791971e0fbc24b07b588eb7895b7a9f050ba5462" exitCode=0 Jan 29 18:01:49 crc kubenswrapper[4886]: I0129 18:01:49.990794 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5rzj" event={"ID":"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4","Type":"ContainerDied","Data":"72d80eebb60ae8c08eef0770791971e0fbc24b07b588eb7895b7a9f050ba5462"} Jan 29 18:01:49 crc kubenswrapper[4886]: I0129 18:01:49.990835 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5rzj" event={"ID":"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4","Type":"ContainerStarted","Data":"9ab7c1c2b880cc6e9d45935d6da276f15ad16d601658ac302237a0b2c36661a6"} Jan 29 18:01:51 crc kubenswrapper[4886]: I0129 18:01:51.004965 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5rzj" event={"ID":"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4","Type":"ContainerStarted","Data":"5f7aea5bf74235eef90aae221f6a2aef210bdde2ace2bb420c8f950bde0f3825"} Jan 29 18:01:52 crc kubenswrapper[4886]: I0129 18:01:52.014678 4886 generic.go:334] "Generic (PLEG): container finished" podID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" containerID="5f7aea5bf74235eef90aae221f6a2aef210bdde2ace2bb420c8f950bde0f3825" exitCode=0 Jan 29 18:01:52 crc kubenswrapper[4886]: I0129 18:01:52.014780 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5rzj" event={"ID":"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4","Type":"ContainerDied","Data":"5f7aea5bf74235eef90aae221f6a2aef210bdde2ace2bb420c8f950bde0f3825"} Jan 29 18:01:53 crc kubenswrapper[4886]: I0129 18:01:53.029062 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5rzj" event={"ID":"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4","Type":"ContainerStarted","Data":"60dffdf8cb175f42305628a1f37333e3d75d62cb2e4e50881c05113585bcdac4"} Jan 29 18:01:53 crc kubenswrapper[4886]: I0129 18:01:53.054104 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q5rzj" podStartSLOduration=2.639959664 podStartE2EDuration="5.054087096s" podCreationTimestamp="2026-01-29 18:01:48 +0000 UTC" firstStartedPulling="2026-01-29 18:01:49.993395938 +0000 UTC m=+5992.902115230" lastFinishedPulling="2026-01-29 18:01:52.40752336 +0000 
UTC m=+5995.316242662" observedRunningTime="2026-01-29 18:01:53.043897558 +0000 UTC m=+5995.952616840" watchObservedRunningTime="2026-01-29 18:01:53.054087096 +0000 UTC m=+5995.962806358" Jan 29 18:01:53 crc kubenswrapper[4886]: I0129 18:01:53.615710 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:01:53 crc kubenswrapper[4886]: E0129 18:01:53.616278 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:01:58 crc kubenswrapper[4886]: I0129 18:01:58.898772 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:58 crc kubenswrapper[4886]: I0129 18:01:58.899512 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:58 crc kubenswrapper[4886]: I0129 18:01:58.958602 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:59 crc kubenswrapper[4886]: I0129 18:01:59.187623 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:01:59 crc kubenswrapper[4886]: I0129 18:01:59.252230 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q5rzj"] Jan 29 18:02:01 crc kubenswrapper[4886]: I0129 18:02:01.120824 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q5rzj" podUID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" containerName="registry-server" containerID="cri-o://60dffdf8cb175f42305628a1f37333e3d75d62cb2e4e50881c05113585bcdac4" gracePeriod=2 Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.132475 4886 generic.go:334] "Generic (PLEG): container finished" podID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" containerID="60dffdf8cb175f42305628a1f37333e3d75d62cb2e4e50881c05113585bcdac4" exitCode=0 Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.132531 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5rzj" event={"ID":"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4","Type":"ContainerDied","Data":"60dffdf8cb175f42305628a1f37333e3d75d62cb2e4e50881c05113585bcdac4"} Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.280821 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.416034 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-catalog-content\") pod \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.416253 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-utilities\") pod \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.416607 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mp79\" (UniqueName: \"kubernetes.io/projected/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-kube-api-access-5mp79\") pod \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\" (UID: \"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4\") " Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.418558 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-utilities" (OuterVolumeSpecName: "utilities") pod "3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" (UID: "3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.420109 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.427713 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-kube-api-access-5mp79" (OuterVolumeSpecName: "kube-api-access-5mp79") pod "3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" (UID: "3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4"). InnerVolumeSpecName "kube-api-access-5mp79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.473473 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" (UID: "3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.521794 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 18:02:02 crc kubenswrapper[4886]: I0129 18:02:02.521827 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mp79\" (UniqueName: \"kubernetes.io/projected/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4-kube-api-access-5mp79\") on node \"crc\" DevicePath \"\"" Jan 29 18:02:03 crc kubenswrapper[4886]: I0129 18:02:03.153280 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5rzj" event={"ID":"3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4","Type":"ContainerDied","Data":"9ab7c1c2b880cc6e9d45935d6da276f15ad16d601658ac302237a0b2c36661a6"} Jan 29 18:02:03 crc kubenswrapper[4886]: I0129 18:02:03.153361 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q5rzj" Jan 29 18:02:03 crc kubenswrapper[4886]: I0129 18:02:03.153737 4886 scope.go:117] "RemoveContainer" containerID="60dffdf8cb175f42305628a1f37333e3d75d62cb2e4e50881c05113585bcdac4" Jan 29 18:02:03 crc kubenswrapper[4886]: I0129 18:02:03.197861 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q5rzj"] Jan 29 18:02:03 crc kubenswrapper[4886]: I0129 18:02:03.199178 4886 scope.go:117] "RemoveContainer" containerID="5f7aea5bf74235eef90aae221f6a2aef210bdde2ace2bb420c8f950bde0f3825" Jan 29 18:02:03 crc kubenswrapper[4886]: I0129 18:02:03.210068 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q5rzj"] Jan 29 18:02:03 crc kubenswrapper[4886]: I0129 18:02:03.230659 4886 scope.go:117] "RemoveContainer" containerID="72d80eebb60ae8c08eef0770791971e0fbc24b07b588eb7895b7a9f050ba5462" Jan 29 18:02:04 crc kubenswrapper[4886]: I0129 18:02:04.640225 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" path="/var/lib/kubelet/pods/3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4/volumes" Jan 29 18:02:07 crc kubenswrapper[4886]: I0129 18:02:07.617406 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:02:07 crc kubenswrapper[4886]: E0129 18:02:07.618566 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:02:20 crc kubenswrapper[4886]: I0129 18:02:20.617517 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:02:20 crc kubenswrapper[4886]: E0129 18:02:20.619111 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:02:35 crc kubenswrapper[4886]: I0129 18:02:35.615711 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:02:35 crc kubenswrapper[4886]: E0129 18:02:35.616827 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:02:48 crc kubenswrapper[4886]: I0129 18:02:48.623845 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:02:48 crc kubenswrapper[4886]: E0129 18:02:48.625220 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:03:01 crc kubenswrapper[4886]: I0129 18:03:01.616101 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:03:01 crc kubenswrapper[4886]: E0129 18:03:01.617399 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:03:16 crc kubenswrapper[4886]: I0129 18:03:16.616620 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:03:16 crc kubenswrapper[4886]: E0129 18:03:16.619965 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.299204 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lsq2b/must-gather-jss9f"] Jan 29 18:03:19 crc kubenswrapper[4886]: E0129 18:03:19.300198 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" containerName="extract-content" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.300219 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" containerName="extract-content" Jan 29 18:03:19 crc kubenswrapper[4886]: E0129 18:03:19.300239 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" 
containerName="registry-server" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.300247 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" containerName="registry-server" Jan 29 18:03:19 crc kubenswrapper[4886]: E0129 18:03:19.300261 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" containerName="extract-utilities" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.300268 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" containerName="extract-utilities" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.300520 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd795ce-91a1-4c71-8332-d1c6b8b9fdf4" containerName="registry-server" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.301788 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lsq2b/must-gather-jss9f" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.305785 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lsq2b"/"kube-root-ca.crt" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.305990 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lsq2b"/"default-dockercfg-xmln4" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.309002 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lsq2b"/"openshift-service-ca.crt" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.317680 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lsq2b/must-gather-jss9f"] Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.358250 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv85m\" (UniqueName: \"kubernetes.io/projected/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-kube-api-access-lv85m\") pod \"must-gather-jss9f\" (UID: \"fd01fd0d-8339-41ba-be01-6c3b723b2ec9\") " pod="openshift-must-gather-lsq2b/must-gather-jss9f" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.358314 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-must-gather-output\") pod \"must-gather-jss9f\" (UID: \"fd01fd0d-8339-41ba-be01-6c3b723b2ec9\") " pod="openshift-must-gather-lsq2b/must-gather-jss9f" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.460688 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv85m\" (UniqueName: \"kubernetes.io/projected/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-kube-api-access-lv85m\") pod \"must-gather-jss9f\" (UID: \"fd01fd0d-8339-41ba-be01-6c3b723b2ec9\") " pod="openshift-must-gather-lsq2b/must-gather-jss9f" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.460760 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-must-gather-output\") pod \"must-gather-jss9f\" (UID: \"fd01fd0d-8339-41ba-be01-6c3b723b2ec9\") " pod="openshift-must-gather-lsq2b/must-gather-jss9f" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.461230 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-must-gather-output\") pod \"must-gather-jss9f\" (UID: \"fd01fd0d-8339-41ba-be01-6c3b723b2ec9\") " pod="openshift-must-gather-lsq2b/must-gather-jss9f" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.480165 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv85m\" (UniqueName: \"kubernetes.io/projected/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-kube-api-access-lv85m\") pod \"must-gather-jss9f\" (UID: \"fd01fd0d-8339-41ba-be01-6c3b723b2ec9\") " pod="openshift-must-gather-lsq2b/must-gather-jss9f" Jan 29 18:03:19 crc kubenswrapper[4886]: I0129 18:03:19.624727 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lsq2b/must-gather-jss9f" Jan 29 18:03:20 crc kubenswrapper[4886]: I0129 18:03:20.295716 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lsq2b/must-gather-jss9f"] Jan 29 18:03:21 crc kubenswrapper[4886]: I0129 18:03:21.251228 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsq2b/must-gather-jss9f" event={"ID":"fd01fd0d-8339-41ba-be01-6c3b723b2ec9","Type":"ContainerStarted","Data":"9ee0c2c0be8a2f9c8d72706f166b6ec33e3d7ddd1d43f8c478fdf3404b486eb6"} Jan 29 18:03:27 crc kubenswrapper[4886]: I0129 18:03:27.625011 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:03:27 crc kubenswrapper[4886]: E0129 18:03:27.625916 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:03:29 crc kubenswrapper[4886]: I0129 18:03:29.338770 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsq2b/must-gather-jss9f" event={"ID":"fd01fd0d-8339-41ba-be01-6c3b723b2ec9","Type":"ContainerStarted","Data":"941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480"} Jan 29 18:03:29 crc kubenswrapper[4886]: I0129 18:03:29.339440 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsq2b/must-gather-jss9f" event={"ID":"fd01fd0d-8339-41ba-be01-6c3b723b2ec9","Type":"ContainerStarted","Data":"2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71"} Jan 29 18:03:29 crc kubenswrapper[4886]: I0129 18:03:29.371786 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lsq2b/must-gather-jss9f" podStartSLOduration=2.255223162 podStartE2EDuration="10.371766909s" podCreationTimestamp="2026-01-29 18:03:19 +0000 UTC" firstStartedPulling="2026-01-29 18:03:20.298512454 +0000 UTC m=+6083.207231726" lastFinishedPulling="2026-01-29 18:03:28.415056191 +0000 UTC m=+6091.323775473" observedRunningTime="2026-01-29 18:03:29.359484461 +0000 UTC m=+6092.268203733" watchObservedRunningTime="2026-01-29 18:03:29.371766909 +0000 UTC m=+6092.280486181" Jan 29 18:03:33 crc kubenswrapper[4886]: I0129 18:03:33.900909 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lsq2b/crc-debug-lpc7l"] Jan 29 18:03:33 crc kubenswrapper[4886]: I0129 18:03:33.903528 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" Jan 29 18:03:34 crc kubenswrapper[4886]: I0129 18:03:34.018179 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcswg\" (UniqueName: \"kubernetes.io/projected/69c46e61-34d0-44e6-89e0-2f9d618c543a-kube-api-access-xcswg\") pod \"crc-debug-lpc7l\" (UID: \"69c46e61-34d0-44e6-89e0-2f9d618c543a\") " pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" Jan 29 18:03:34 crc kubenswrapper[4886]: I0129 18:03:34.018447 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69c46e61-34d0-44e6-89e0-2f9d618c543a-host\") pod \"crc-debug-lpc7l\" (UID: \"69c46e61-34d0-44e6-89e0-2f9d618c543a\") " pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" Jan 29 18:03:34 crc kubenswrapper[4886]: I0129 18:03:34.120715 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69c46e61-34d0-44e6-89e0-2f9d618c543a-host\") pod \"crc-debug-lpc7l\" (UID: \"69c46e61-34d0-44e6-89e0-2f9d618c543a\") " pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" Jan 29 18:03:34 crc kubenswrapper[4886]: I0129 18:03:34.120840 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69c46e61-34d0-44e6-89e0-2f9d618c543a-host\") pod \"crc-debug-lpc7l\" (UID: \"69c46e61-34d0-44e6-89e0-2f9d618c543a\") " pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" Jan 29 18:03:34 crc kubenswrapper[4886]: I0129 18:03:34.121122 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcswg\" (UniqueName: \"kubernetes.io/projected/69c46e61-34d0-44e6-89e0-2f9d618c543a-kube-api-access-xcswg\") pod \"crc-debug-lpc7l\" (UID: \"69c46e61-34d0-44e6-89e0-2f9d618c543a\") " pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" Jan 29 18:03:34 crc kubenswrapper[4886]: I0129 18:03:34.156159 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcswg\" (UniqueName: \"kubernetes.io/projected/69c46e61-34d0-44e6-89e0-2f9d618c543a-kube-api-access-xcswg\") pod \"crc-debug-lpc7l\" (UID: \"69c46e61-34d0-44e6-89e0-2f9d618c543a\") " pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" Jan 29 18:03:34 crc kubenswrapper[4886]: I0129 18:03:34.227695 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" Jan 29 18:03:34 crc kubenswrapper[4886]: I0129 18:03:34.400982 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" event={"ID":"69c46e61-34d0-44e6-89e0-2f9d618c543a","Type":"ContainerStarted","Data":"3097596f1f56f04205d69d1e7a2a030494676385b6096f7976a95369dd790bf0"} Jan 29 18:03:40 crc kubenswrapper[4886]: I0129 18:03:40.617231 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:03:40 crc kubenswrapper[4886]: E0129 18:03:40.617879 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:03:47 crc kubenswrapper[4886]: I0129 18:03:47.521270 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" event={"ID":"69c46e61-34d0-44e6-89e0-2f9d618c543a","Type":"ContainerStarted","Data":"9151f75a515b793b76d61e304966261ea994214c86da5ff66a0d5a788f6197a1"} Jan 29 18:03:47 crc kubenswrapper[4886]: I0129 18:03:47.542771 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" podStartSLOduration=1.938434142 podStartE2EDuration="14.542755533s" podCreationTimestamp="2026-01-29 18:03:33 +0000 UTC" firstStartedPulling="2026-01-29 18:03:34.295552039 +0000 UTC m=+6097.204271311" lastFinishedPulling="2026-01-29 18:03:46.89987342 +0000 UTC m=+6109.808592702" observedRunningTime="2026-01-29 18:03:47.538053379 +0000 UTC m=+6110.446772651" watchObservedRunningTime="2026-01-29 18:03:47.542755533 +0000 UTC m=+6110.451474805" Jan 29 18:03:51 crc kubenswrapper[4886]: I0129 18:03:51.615064 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:03:51 crc kubenswrapper[4886]: E0129 18:03:51.616017 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:04:04 crc kubenswrapper[4886]: I0129 18:04:04.619008 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:04:04 crc kubenswrapper[4886]: E0129 18:04:04.619659 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:04:09 crc kubenswrapper[4886]: I0129 18:04:09.753940 4886 generic.go:334] "Generic (PLEG): container finished" podID="69c46e61-34d0-44e6-89e0-2f9d618c543a" 
containerID="9151f75a515b793b76d61e304966261ea994214c86da5ff66a0d5a788f6197a1" exitCode=0 Jan 29 18:04:09 crc kubenswrapper[4886]: I0129 18:04:09.754009 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" event={"ID":"69c46e61-34d0-44e6-89e0-2f9d618c543a","Type":"ContainerDied","Data":"9151f75a515b793b76d61e304966261ea994214c86da5ff66a0d5a788f6197a1"} Jan 29 18:04:10 crc kubenswrapper[4886]: I0129 18:04:10.922133 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" Jan 29 18:04:10 crc kubenswrapper[4886]: I0129 18:04:10.949984 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lsq2b/crc-debug-lpc7l"] Jan 29 18:04:10 crc kubenswrapper[4886]: I0129 18:04:10.959804 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lsq2b/crc-debug-lpc7l"] Jan 29 18:04:11 crc kubenswrapper[4886]: I0129 18:04:11.014956 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69c46e61-34d0-44e6-89e0-2f9d618c543a-host\") pod \"69c46e61-34d0-44e6-89e0-2f9d618c543a\" (UID: \"69c46e61-34d0-44e6-89e0-2f9d618c543a\") " Jan 29 18:04:11 crc kubenswrapper[4886]: I0129 18:04:11.015079 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69c46e61-34d0-44e6-89e0-2f9d618c543a-host" (OuterVolumeSpecName: "host") pod "69c46e61-34d0-44e6-89e0-2f9d618c543a" (UID: "69c46e61-34d0-44e6-89e0-2f9d618c543a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 18:04:11 crc kubenswrapper[4886]: I0129 18:04:11.015106 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcswg\" (UniqueName: \"kubernetes.io/projected/69c46e61-34d0-44e6-89e0-2f9d618c543a-kube-api-access-xcswg\") pod \"69c46e61-34d0-44e6-89e0-2f9d618c543a\" (UID: \"69c46e61-34d0-44e6-89e0-2f9d618c543a\") " Jan 29 18:04:11 crc kubenswrapper[4886]: I0129 18:04:11.015632 4886 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69c46e61-34d0-44e6-89e0-2f9d618c543a-host\") on node \"crc\" DevicePath \"\"" Jan 29 18:04:11 crc kubenswrapper[4886]: I0129 18:04:11.020529 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c46e61-34d0-44e6-89e0-2f9d618c543a-kube-api-access-xcswg" (OuterVolumeSpecName: "kube-api-access-xcswg") pod "69c46e61-34d0-44e6-89e0-2f9d618c543a" (UID: "69c46e61-34d0-44e6-89e0-2f9d618c543a"). InnerVolumeSpecName "kube-api-access-xcswg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:04:11 crc kubenswrapper[4886]: I0129 18:04:11.117445 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcswg\" (UniqueName: \"kubernetes.io/projected/69c46e61-34d0-44e6-89e0-2f9d618c543a-kube-api-access-xcswg\") on node \"crc\" DevicePath \"\"" Jan 29 18:04:11 crc kubenswrapper[4886]: I0129 18:04:11.789546 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3097596f1f56f04205d69d1e7a2a030494676385b6096f7976a95369dd790bf0" Jan 29 18:04:11 crc kubenswrapper[4886]: I0129 18:04:11.789675 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lsq2b/crc-debug-lpc7l" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.164294 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lsq2b/crc-debug-f6g6p"] Jan 29 18:04:12 crc kubenswrapper[4886]: E0129 18:04:12.165151 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c46e61-34d0-44e6-89e0-2f9d618c543a" containerName="container-00" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.165180 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c46e61-34d0-44e6-89e0-2f9d618c543a" containerName="container-00" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.165515 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c46e61-34d0-44e6-89e0-2f9d618c543a" containerName="container-00" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.166731 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.256038 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-host\") pod \"crc-debug-f6g6p\" (UID: \"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf\") " pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.256149 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxthf\" (UniqueName: \"kubernetes.io/projected/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-kube-api-access-qxthf\") pod \"crc-debug-f6g6p\" (UID: \"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf\") " pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.359046 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxthf\" (UniqueName: \"kubernetes.io/projected/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-kube-api-access-qxthf\") pod \"crc-debug-f6g6p\" (UID: \"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf\") " pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.365445 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-host\") pod \"crc-debug-f6g6p\" (UID: \"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf\") " pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.365578 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-host\") pod \"crc-debug-f6g6p\" (UID: \"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf\") " pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.391417 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxthf\" (UniqueName: \"kubernetes.io/projected/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-kube-api-access-qxthf\") pod \"crc-debug-f6g6p\" (UID: \"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf\") " pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.494469 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.644198 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c46e61-34d0-44e6-89e0-2f9d618c543a" path="/var/lib/kubelet/pods/69c46e61-34d0-44e6-89e0-2f9d618c543a/volumes" Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.799802 4886 generic.go:334] "Generic (PLEG): container finished" podID="ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf" containerID="4cbfd678d0e8c9a0c43080d33d221a63872ea34632a51cb0a6c22a5407b09f79" exitCode=1 Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.799905 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" event={"ID":"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf","Type":"ContainerDied","Data":"4cbfd678d0e8c9a0c43080d33d221a63872ea34632a51cb0a6c22a5407b09f79"} Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.800352 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" event={"ID":"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf","Type":"ContainerStarted","Data":"7ac46cc770b9f49f63ae79a1a7b6a62e74f610aa8420a3a20c45964faaf5ceab"} Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.844867 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lsq2b/crc-debug-f6g6p"] Jan 29 18:04:12 crc kubenswrapper[4886]: I0129 18:04:12.853458 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lsq2b/crc-debug-f6g6p"] Jan 29 18:04:13 crc kubenswrapper[4886]: I0129 18:04:13.923909 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" Jan 29 18:04:14 crc kubenswrapper[4886]: I0129 18:04:14.004278 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxthf\" (UniqueName: \"kubernetes.io/projected/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-kube-api-access-qxthf\") pod \"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf\" (UID: \"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf\") " Jan 29 18:04:14 crc kubenswrapper[4886]: I0129 18:04:14.004562 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-host\") pod \"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf\" (UID: \"ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf\") " Jan 29 18:04:14 crc kubenswrapper[4886]: I0129 18:04:14.004924 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-host" (OuterVolumeSpecName: "host") pod "ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf" (UID: "ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 18:04:14 crc kubenswrapper[4886]: I0129 18:04:14.005407 4886 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-host\") on node \"crc\" DevicePath \"\"" Jan 29 18:04:14 crc kubenswrapper[4886]: I0129 18:04:14.017043 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-kube-api-access-qxthf" (OuterVolumeSpecName: "kube-api-access-qxthf") pod "ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf" (UID: "ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf"). InnerVolumeSpecName "kube-api-access-qxthf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:04:14 crc kubenswrapper[4886]: I0129 18:04:14.109106 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxthf\" (UniqueName: \"kubernetes.io/projected/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf-kube-api-access-qxthf\") on node \"crc\" DevicePath \"\"" Jan 29 18:04:14 crc kubenswrapper[4886]: I0129 18:04:14.631665 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf" path="/var/lib/kubelet/pods/ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf/volumes" Jan 29 18:04:14 crc kubenswrapper[4886]: I0129 18:04:14.820493 4886 scope.go:117] "RemoveContainer" containerID="4cbfd678d0e8c9a0c43080d33d221a63872ea34632a51cb0a6c22a5407b09f79" Jan 29 18:04:14 crc kubenswrapper[4886]: I0129 18:04:14.820555 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lsq2b/crc-debug-f6g6p" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.568300 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kmvr5"] Jan 29 18:04:15 crc kubenswrapper[4886]: E0129 18:04:15.569311 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf" containerName="container-00" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.569406 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf" containerName="container-00" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.569903 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff59c3a7-8e4e-4c1d-a0f0-ffd6fc31ddbf" containerName="container-00" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.583574 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.586235 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmvr5"] Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.686200 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-utilities\") pod \"redhat-marketplace-kmvr5\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.686295 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-catalog-content\") pod \"redhat-marketplace-kmvr5\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.686417 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfjc5\" (UniqueName: \"kubernetes.io/projected/cf73c735-d3aa-476b-9390-6a150d51a290-kube-api-access-lfjc5\") pod \"redhat-marketplace-kmvr5\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.790177 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfjc5\" (UniqueName: \"kubernetes.io/projected/cf73c735-d3aa-476b-9390-6a150d51a290-kube-api-access-lfjc5\") pod \"redhat-marketplace-kmvr5\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.790372 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-utilities\") pod \"redhat-marketplace-kmvr5\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.790432 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-catalog-content\") pod \"redhat-marketplace-kmvr5\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.790982 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-utilities\") pod \"redhat-marketplace-kmvr5\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.791032 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-catalog-content\") pod \"redhat-marketplace-kmvr5\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.817265 4886 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lfjc5\" (UniqueName: \"kubernetes.io/projected/cf73c735-d3aa-476b-9390-6a150d51a290-kube-api-access-lfjc5\") pod \"redhat-marketplace-kmvr5\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:15 crc kubenswrapper[4886]: I0129 18:04:15.909037 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:16 crc kubenswrapper[4886]: I0129 18:04:16.419349 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmvr5"] Jan 29 18:04:16 crc kubenswrapper[4886]: W0129 18:04:16.421580 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf73c735_d3aa_476b_9390_6a150d51a290.slice/crio-3ea04023ad6f2098f354054573352189e64af7b720c4d23b8d794816a83966a1 WatchSource:0}: Error finding container 3ea04023ad6f2098f354054573352189e64af7b720c4d23b8d794816a83966a1: Status 404 returned error can't find the container with id 3ea04023ad6f2098f354054573352189e64af7b720c4d23b8d794816a83966a1 Jan 29 18:04:16 crc kubenswrapper[4886]: I0129 18:04:16.847066 4886 generic.go:334] "Generic (PLEG): container finished" podID="cf73c735-d3aa-476b-9390-6a150d51a290" containerID="54c179145b068653a1e221165954ed6dc1e5732be8151bfe1ac6f1f61a83422f" exitCode=0 Jan 29 18:04:16 crc kubenswrapper[4886]: I0129 18:04:16.847137 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvr5" event={"ID":"cf73c735-d3aa-476b-9390-6a150d51a290","Type":"ContainerDied","Data":"54c179145b068653a1e221165954ed6dc1e5732be8151bfe1ac6f1f61a83422f"} Jan 29 18:04:16 crc kubenswrapper[4886]: I0129 18:04:16.847310 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvr5" event={"ID":"cf73c735-d3aa-476b-9390-6a150d51a290","Type":"ContainerStarted","Data":"3ea04023ad6f2098f354054573352189e64af7b720c4d23b8d794816a83966a1"} Jan 29 18:04:18 crc kubenswrapper[4886]: I0129 18:04:18.864849 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvr5" event={"ID":"cf73c735-d3aa-476b-9390-6a150d51a290","Type":"ContainerStarted","Data":"dff1bde7e6d514472b2010c2fd3b5381b5a397e39be70a375089aa152b0fac0f"} Jan 29 18:04:19 crc kubenswrapper[4886]: I0129 18:04:19.616111 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:04:19 crc kubenswrapper[4886]: E0129 18:04:19.616760 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:04:19 crc kubenswrapper[4886]: I0129 18:04:19.876083 4886 generic.go:334] "Generic (PLEG): container finished" podID="cf73c735-d3aa-476b-9390-6a150d51a290" containerID="dff1bde7e6d514472b2010c2fd3b5381b5a397e39be70a375089aa152b0fac0f" exitCode=0 Jan 29 18:04:19 crc kubenswrapper[4886]: I0129 18:04:19.876124 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvr5" 
event={"ID":"cf73c735-d3aa-476b-9390-6a150d51a290","Type":"ContainerDied","Data":"dff1bde7e6d514472b2010c2fd3b5381b5a397e39be70a375089aa152b0fac0f"} Jan 29 18:04:20 crc kubenswrapper[4886]: I0129 18:04:20.890073 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvr5" event={"ID":"cf73c735-d3aa-476b-9390-6a150d51a290","Type":"ContainerStarted","Data":"27750b35201061f1ffd4a205c3e0c5eef07cfdb632a99934639047305555bc63"} Jan 29 18:04:20 crc kubenswrapper[4886]: I0129 18:04:20.917017 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kmvr5" podStartSLOduration=2.483100357 podStartE2EDuration="5.916998993s" podCreationTimestamp="2026-01-29 18:04:15 +0000 UTC" firstStartedPulling="2026-01-29 18:04:16.849516628 +0000 UTC m=+6139.758235910" lastFinishedPulling="2026-01-29 18:04:20.283415254 +0000 UTC m=+6143.192134546" observedRunningTime="2026-01-29 18:04:20.906233058 +0000 UTC m=+6143.814952340" watchObservedRunningTime="2026-01-29 18:04:20.916998993 +0000 UTC m=+6143.825718275" Jan 29 18:04:20 crc kubenswrapper[4886]: I0129 18:04:20.969825 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lj627"] Jan 29 18:04:20 crc kubenswrapper[4886]: I0129 18:04:20.972741 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:20 crc kubenswrapper[4886]: I0129 18:04:20.997029 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lj627"] Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.135502 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-utilities\") pod \"redhat-operators-lj627\" (UID: \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.135691 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-catalog-content\") pod \"redhat-operators-lj627\" (UID: \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.135928 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn8wx\" (UniqueName: \"kubernetes.io/projected/ace6b3f5-2f50-4320-87db-40229f5f2cfa-kube-api-access-bn8wx\") pod \"redhat-operators-lj627\" (UID: \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.238480 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn8wx\" (UniqueName: \"kubernetes.io/projected/ace6b3f5-2f50-4320-87db-40229f5f2cfa-kube-api-access-bn8wx\") pod \"redhat-operators-lj627\" (UID: \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.238627 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-utilities\") pod \"redhat-operators-lj627\" (UID: 
\"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.238698 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-catalog-content\") pod \"redhat-operators-lj627\" (UID: \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.239222 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-catalog-content\") pod \"redhat-operators-lj627\" (UID: \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.239374 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-utilities\") pod \"redhat-operators-lj627\" (UID: \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.256465 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn8wx\" (UniqueName: \"kubernetes.io/projected/ace6b3f5-2f50-4320-87db-40229f5f2cfa-kube-api-access-bn8wx\") pod \"redhat-operators-lj627\" (UID: \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.307283 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.830082 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lj627"] Jan 29 18:04:21 crc kubenswrapper[4886]: I0129 18:04:21.905469 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj627" event={"ID":"ace6b3f5-2f50-4320-87db-40229f5f2cfa","Type":"ContainerStarted","Data":"933561cc3f3d4b68e66a04703782c2021621ec267367f5610272c1e684a67323"} Jan 29 18:04:22 crc kubenswrapper[4886]: I0129 18:04:22.916275 4886 generic.go:334] "Generic (PLEG): container finished" podID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerID="468f6a38bd34b0f68ce35ac9861dbb58e082aa0417a0fb5de5b0cab0abc3db06" exitCode=0 Jan 29 18:04:22 crc kubenswrapper[4886]: I0129 18:04:22.916384 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj627" event={"ID":"ace6b3f5-2f50-4320-87db-40229f5f2cfa","Type":"ContainerDied","Data":"468f6a38bd34b0f68ce35ac9861dbb58e082aa0417a0fb5de5b0cab0abc3db06"} Jan 29 18:04:23 crc kubenswrapper[4886]: I0129 18:04:23.927062 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj627" event={"ID":"ace6b3f5-2f50-4320-87db-40229f5f2cfa","Type":"ContainerStarted","Data":"7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592"} Jan 29 18:04:25 crc kubenswrapper[4886]: I0129 18:04:25.909471 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:25 crc kubenswrapper[4886]: I0129 18:04:25.910178 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:25 crc kubenswrapper[4886]: I0129 18:04:25.979072 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:26 crc kubenswrapper[4886]: I0129 18:04:26.032062 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:29 crc kubenswrapper[4886]: I0129 18:04:29.987276 4886 generic.go:334] "Generic (PLEG): container finished" podID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerID="7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592" exitCode=0 Jan 29 18:04:29 crc kubenswrapper[4886]: I0129 18:04:29.987363 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj627" event={"ID":"ace6b3f5-2f50-4320-87db-40229f5f2cfa","Type":"ContainerDied","Data":"7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592"} Jan 29 18:04:31 crc kubenswrapper[4886]: I0129 18:04:31.615764 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:04:31 crc kubenswrapper[4886]: E0129 18:04:31.616408 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:04:32 crc kubenswrapper[4886]: I0129 18:04:32.010937 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj627" event={"ID":"ace6b3f5-2f50-4320-87db-40229f5f2cfa","Type":"ContainerStarted","Data":"11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418"} Jan 29 18:04:32 crc kubenswrapper[4886]: I0129 18:04:32.034742 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lj627" podStartSLOduration=3.433709689 podStartE2EDuration="12.034711853s" podCreationTimestamp="2026-01-29 18:04:20 +0000 UTC" firstStartedPulling="2026-01-29 18:04:22.919689695 +0000 UTC m=+6145.828408977" lastFinishedPulling="2026-01-29 18:04:31.520691859 +0000 UTC m=+6154.429411141" observedRunningTime="2026-01-29 18:04:32.030440522 +0000 UTC m=+6154.939159794" watchObservedRunningTime="2026-01-29 18:04:32.034711853 +0000 UTC m=+6154.943431155" Jan 29 18:04:33 crc kubenswrapper[4886]: I0129 18:04:33.573203 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmvr5"] Jan 29 18:04:33 crc kubenswrapper[4886]: I0129 18:04:33.573937 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kmvr5" podUID="cf73c735-d3aa-476b-9390-6a150d51a290" containerName="registry-server" containerID="cri-o://27750b35201061f1ffd4a205c3e0c5eef07cfdb632a99934639047305555bc63" gracePeriod=2 Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.045350 4886 generic.go:334] "Generic (PLEG): container finished" podID="cf73c735-d3aa-476b-9390-6a150d51a290" containerID="27750b35201061f1ffd4a205c3e0c5eef07cfdb632a99934639047305555bc63" exitCode=0 Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.045438 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-kmvr5" event={"ID":"cf73c735-d3aa-476b-9390-6a150d51a290","Type":"ContainerDied","Data":"27750b35201061f1ffd4a205c3e0c5eef07cfdb632a99934639047305555bc63"} Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.223595 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.296821 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfjc5\" (UniqueName: \"kubernetes.io/projected/cf73c735-d3aa-476b-9390-6a150d51a290-kube-api-access-lfjc5\") pod \"cf73c735-d3aa-476b-9390-6a150d51a290\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.296918 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-utilities\") pod \"cf73c735-d3aa-476b-9390-6a150d51a290\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.296968 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-catalog-content\") pod \"cf73c735-d3aa-476b-9390-6a150d51a290\" (UID: \"cf73c735-d3aa-476b-9390-6a150d51a290\") " Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.298206 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-utilities" (OuterVolumeSpecName: "utilities") pod "cf73c735-d3aa-476b-9390-6a150d51a290" (UID: "cf73c735-d3aa-476b-9390-6a150d51a290"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.299131 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.305834 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf73c735-d3aa-476b-9390-6a150d51a290-kube-api-access-lfjc5" (OuterVolumeSpecName: "kube-api-access-lfjc5") pod "cf73c735-d3aa-476b-9390-6a150d51a290" (UID: "cf73c735-d3aa-476b-9390-6a150d51a290"). InnerVolumeSpecName "kube-api-access-lfjc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.324674 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf73c735-d3aa-476b-9390-6a150d51a290" (UID: "cf73c735-d3aa-476b-9390-6a150d51a290"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.402102 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfjc5\" (UniqueName: \"kubernetes.io/projected/cf73c735-d3aa-476b-9390-6a150d51a290-kube-api-access-lfjc5\") on node \"crc\" DevicePath \"\"" Jan 29 18:04:34 crc kubenswrapper[4886]: I0129 18:04:34.402153 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf73c735-d3aa-476b-9390-6a150d51a290-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 18:04:35 crc kubenswrapper[4886]: I0129 18:04:35.056951 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvr5" event={"ID":"cf73c735-d3aa-476b-9390-6a150d51a290","Type":"ContainerDied","Data":"3ea04023ad6f2098f354054573352189e64af7b720c4d23b8d794816a83966a1"} Jan 29 18:04:35 crc kubenswrapper[4886]: I0129 18:04:35.058208 4886 scope.go:117] "RemoveContainer" containerID="27750b35201061f1ffd4a205c3e0c5eef07cfdb632a99934639047305555bc63" Jan 29 18:04:35 crc kubenswrapper[4886]: I0129 18:04:35.058499 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmvr5" Jan 29 18:04:35 crc kubenswrapper[4886]: I0129 18:04:35.087941 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmvr5"] Jan 29 18:04:35 crc kubenswrapper[4886]: I0129 18:04:35.095691 4886 scope.go:117] "RemoveContainer" containerID="dff1bde7e6d514472b2010c2fd3b5381b5a397e39be70a375089aa152b0fac0f" Jan 29 18:04:35 crc kubenswrapper[4886]: I0129 18:04:35.097234 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmvr5"] Jan 29 18:04:35 crc kubenswrapper[4886]: I0129 18:04:35.118004 4886 scope.go:117] "RemoveContainer" containerID="54c179145b068653a1e221165954ed6dc1e5732be8151bfe1ac6f1f61a83422f" Jan 29 18:04:36 crc kubenswrapper[4886]: I0129 18:04:36.626887 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf73c735-d3aa-476b-9390-6a150d51a290" path="/var/lib/kubelet/pods/cf73c735-d3aa-476b-9390-6a150d51a290/volumes" Jan 29 18:04:41 crc kubenswrapper[4886]: I0129 18:04:41.308271 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:41 crc kubenswrapper[4886]: I0129 18:04:41.308792 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:42 crc kubenswrapper[4886]: I0129 18:04:42.362229 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lj627" podUID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerName="registry-server" probeResult="failure" output=< Jan 29 18:04:42 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 18:04:42 crc kubenswrapper[4886]: > Jan 29 18:04:45 crc kubenswrapper[4886]: I0129 18:04:45.615666 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:04:45 crc kubenswrapper[4886]: E0129 18:04:45.616351 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:04:51 crc kubenswrapper[4886]: I0129 18:04:51.405695 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:51 crc kubenswrapper[4886]: I0129 18:04:51.462696 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:52 crc kubenswrapper[4886]: I0129 18:04:52.175359 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lj627"] Jan 29 18:04:53 crc kubenswrapper[4886]: I0129 18:04:53.230173 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lj627" podUID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerName="registry-server" containerID="cri-o://11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418" gracePeriod=2 Jan 29 18:04:53 crc kubenswrapper[4886]: I0129 18:04:53.797597 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:53 crc kubenswrapper[4886]: I0129 18:04:53.866101 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-catalog-content\") pod \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\" (UID: \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " Jan 29 18:04:53 crc kubenswrapper[4886]: I0129 18:04:53.866216 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn8wx\" (UniqueName: \"kubernetes.io/projected/ace6b3f5-2f50-4320-87db-40229f5f2cfa-kube-api-access-bn8wx\") pod \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\" (UID: \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " Jan 29 18:04:53 crc kubenswrapper[4886]: I0129 18:04:53.866291 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-utilities\") pod \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\" (UID: \"ace6b3f5-2f50-4320-87db-40229f5f2cfa\") " Jan 29 18:04:53 crc kubenswrapper[4886]: I0129 18:04:53.867183 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-utilities" (OuterVolumeSpecName: "utilities") pod "ace6b3f5-2f50-4320-87db-40229f5f2cfa" (UID: "ace6b3f5-2f50-4320-87db-40229f5f2cfa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:04:53 crc kubenswrapper[4886]: I0129 18:04:53.876546 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace6b3f5-2f50-4320-87db-40229f5f2cfa-kube-api-access-bn8wx" (OuterVolumeSpecName: "kube-api-access-bn8wx") pod "ace6b3f5-2f50-4320-87db-40229f5f2cfa" (UID: "ace6b3f5-2f50-4320-87db-40229f5f2cfa"). InnerVolumeSpecName "kube-api-access-bn8wx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:04:53 crc kubenswrapper[4886]: I0129 18:04:53.970008 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 18:04:53 crc kubenswrapper[4886]: I0129 18:04:53.970075 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn8wx\" (UniqueName: \"kubernetes.io/projected/ace6b3f5-2f50-4320-87db-40229f5f2cfa-kube-api-access-bn8wx\") on node \"crc\" DevicePath \"\"" Jan 29 18:04:53 crc kubenswrapper[4886]: I0129 18:04:53.990066 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ace6b3f5-2f50-4320-87db-40229f5f2cfa" (UID: "ace6b3f5-2f50-4320-87db-40229f5f2cfa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.072050 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace6b3f5-2f50-4320-87db-40229f5f2cfa-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.244189 4886 generic.go:334] "Generic (PLEG): container finished" podID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerID="11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418" exitCode=0 Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.244227 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj627" event={"ID":"ace6b3f5-2f50-4320-87db-40229f5f2cfa","Type":"ContainerDied","Data":"11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418"} Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.244280 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj627" event={"ID":"ace6b3f5-2f50-4320-87db-40229f5f2cfa","Type":"ContainerDied","Data":"933561cc3f3d4b68e66a04703782c2021621ec267367f5610272c1e684a67323"} Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.244305 4886 scope.go:117] "RemoveContainer" containerID="11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.244315 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lj627" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.293497 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lj627"] Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.293671 4886 scope.go:117] "RemoveContainer" containerID="7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.302236 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lj627"] Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.328047 4886 scope.go:117] "RemoveContainer" containerID="468f6a38bd34b0f68ce35ac9861dbb58e082aa0417a0fb5de5b0cab0abc3db06" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.417723 4886 scope.go:117] "RemoveContainer" containerID="11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418" Jan 29 18:04:54 crc kubenswrapper[4886]: E0129 18:04:54.421770 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418\": container with ID starting with 11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418 not found: ID does not exist" containerID="11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.421816 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418"} err="failed to get container status \"11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418\": rpc error: code = NotFound desc = could not find container \"11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418\": container with ID starting with 11705f34993c6638ba8642b38964a86a5de557e9f2d5c74da2dc5a7240803418 not found: ID does not exist" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.421852 4886 scope.go:117] "RemoveContainer" containerID="7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592" Jan 29 18:04:54 crc kubenswrapper[4886]: E0129 18:04:54.422260 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592\": container with ID starting with 7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592 not found: ID does not exist" containerID="7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.422341 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592"} err="failed to get container status \"7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592\": rpc error: code = NotFound desc = could not find container \"7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592\": container with ID starting with 7d317f44136dcc76fb7783151dfa87ebccf117dd0b425dcb78ac3d5980079592 not found: ID does not exist" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.422369 4886 scope.go:117] "RemoveContainer" containerID="468f6a38bd34b0f68ce35ac9861dbb58e082aa0417a0fb5de5b0cab0abc3db06" Jan 29 18:04:54 crc kubenswrapper[4886]: E0129 18:04:54.424701 4886 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"468f6a38bd34b0f68ce35ac9861dbb58e082aa0417a0fb5de5b0cab0abc3db06\": container with ID starting with 468f6a38bd34b0f68ce35ac9861dbb58e082aa0417a0fb5de5b0cab0abc3db06 not found: ID does not exist" containerID="468f6a38bd34b0f68ce35ac9861dbb58e082aa0417a0fb5de5b0cab0abc3db06" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.424770 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"468f6a38bd34b0f68ce35ac9861dbb58e082aa0417a0fb5de5b0cab0abc3db06"} err="failed to get container status \"468f6a38bd34b0f68ce35ac9861dbb58e082aa0417a0fb5de5b0cab0abc3db06\": rpc error: code = NotFound desc = could not find container \"468f6a38bd34b0f68ce35ac9861dbb58e082aa0417a0fb5de5b0cab0abc3db06\": container with ID starting with 468f6a38bd34b0f68ce35ac9861dbb58e082aa0417a0fb5de5b0cab0abc3db06 not found: ID does not exist" Jan 29 18:04:54 crc kubenswrapper[4886]: I0129 18:04:54.625942 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" path="/var/lib/kubelet/pods/ace6b3f5-2f50-4320-87db-40229f5f2cfa/volumes" Jan 29 18:04:58 crc kubenswrapper[4886]: I0129 18:04:58.622229 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:04:58 crc kubenswrapper[4886]: E0129 18:04:58.622766 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 18:05:07.072025 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fb894ff6d-w7s26_b87936a5-19e1-4a58-948f-1f569c08bb6b/barbican-api/0.log" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 18:05:07.277547 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fb894ff6d-w7s26_b87936a5-19e1-4a58-948f-1f569c08bb6b/barbican-api-log/0.log" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 18:05:07.339196 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-85cc5d579d-jhqqd_054e527c-8ce1-4d03-8fef-0430934daba3/barbican-keystone-listener-log/0.log" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 18:05:07.366508 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-85cc5d579d-jhqqd_054e527c-8ce1-4d03-8fef-0430934daba3/barbican-keystone-listener/0.log" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 18:05:07.542781 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f4657cb95-4tfvc_8f83894a-73ec-405a-bdd2-2044b3f9140a/barbican-worker-log/0.log" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 18:05:07.552953 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f4657cb95-4tfvc_8f83894a-73ec-405a-bdd2-2044b3f9140a/barbican-worker/0.log" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 18:05:07.734742 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_23f9894b-5940-4f78-9062-719f7e7eca3a/ceilometer-central-agent/0.log" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 
18:05:07.735085 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_23f9894b-5940-4f78-9062-719f7e7eca3a/ceilometer-notification-agent/0.log" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 18:05:07.789562 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_23f9894b-5940-4f78-9062-719f7e7eca3a/sg-core/0.log" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 18:05:07.790845 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_23f9894b-5940-4f78-9062-719f7e7eca3a/proxy-httpd/0.log" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 18:05:07.975167 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3573eaa4-4c27-4747-a691-15ae61d152f3/cinder-api/0.log" Jan 29 18:05:07 crc kubenswrapper[4886]: I0129 18:05:07.987213 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3573eaa4-4c27-4747-a691-15ae61d152f3/cinder-api-log/0.log" Jan 29 18:05:08 crc kubenswrapper[4886]: I0129 18:05:08.157248 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d9b55479-5ea1-4a5b-9e34-e83313b04dec/cinder-scheduler/0.log" Jan 29 18:05:08 crc kubenswrapper[4886]: I0129 18:05:08.232359 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-fh86h_efe27968-ef82-463a-8852-222528e7980d/init/0.log" Jan 29 18:05:08 crc kubenswrapper[4886]: I0129 18:05:08.259451 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d9b55479-5ea1-4a5b-9e34-e83313b04dec/probe/0.log" Jan 29 18:05:08 crc kubenswrapper[4886]: I0129 18:05:08.449359 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-fh86h_efe27968-ef82-463a-8852-222528e7980d/init/0.log" Jan 29 18:05:08 crc kubenswrapper[4886]: I0129 18:05:08.489183 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-fh86h_efe27968-ef82-463a-8852-222528e7980d/dnsmasq-dns/0.log" Jan 29 18:05:08 crc kubenswrapper[4886]: I0129 18:05:08.516596 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2dbf03ea-9df9-4f03-aee9-113dabed1c7a/glance-httpd/0.log" Jan 29 18:05:08 crc kubenswrapper[4886]: I0129 18:05:08.610829 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2dbf03ea-9df9-4f03-aee9-113dabed1c7a/glance-log/0.log" Jan 29 18:05:08 crc kubenswrapper[4886]: I0129 18:05:08.720899 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_81437be4-b399-40e9-9c33-e71319326af8/glance-httpd/0.log" Jan 29 18:05:08 crc kubenswrapper[4886]: I0129 18:05:08.740557 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_81437be4-b399-40e9-9c33-e71319326af8/glance-log/0.log" Jan 29 18:05:09 crc kubenswrapper[4886]: I0129 18:05:09.441011 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5f6fd667fd-4s5hk_3b8fde91-2520-41c6-bc79-1f6b186dcbf0/heat-engine/0.log" Jan 29 18:05:09 crc kubenswrapper[4886]: I0129 18:05:09.616349 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7c65449fdf-42rxg_c5fcdcf3-c18b-4f0b-ac46-7be1d56fc3a2/heat-cfnapi/0.log" Jan 29 18:05:09 crc kubenswrapper[4886]: I0129 18:05:09.682789 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_heat-api-64bb5bfdfc-h2mgd_a004f05d-8133-4d8e-9e3c-d5c9411351ad/heat-api/0.log" Jan 29 18:05:09 crc kubenswrapper[4886]: I0129 18:05:09.833508 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5499bdc9-q6hr4_d9e327b0-6e20-4b1d-a18f-64b8b49ef36d/keystone-api/0.log" Jan 29 18:05:09 crc kubenswrapper[4886]: I0129 18:05:09.896413 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29495161-tqptf_62fe5584-12c8-4933-868d-bbb9e04f7bb3/keystone-cron/0.log" Jan 29 18:05:10 crc kubenswrapper[4886]: I0129 18:05:10.122220 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_fa42ea64-73bc-439c-802c-65ef65a39015/kube-state-metrics/0.log" Jan 29 18:05:10 crc kubenswrapper[4886]: I0129 18:05:10.390832 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_aa7423ef-f68a-4969-a81b-fd2ce4dbc16a/mysqld-exporter/0.log" Jan 29 18:05:10 crc kubenswrapper[4886]: I0129 18:05:10.813146 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-846d49f49c-kc98b_344feff6-8139-425e-b7dc-f35fe5b17247/neutron-api/0.log" Jan 29 18:05:10 crc kubenswrapper[4886]: I0129 18:05:10.827157 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-846d49f49c-kc98b_344feff6-8139-425e-b7dc-f35fe5b17247/neutron-httpd/0.log" Jan 29 18:05:11 crc kubenswrapper[4886]: I0129 18:05:11.178642 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_cbffe358-e916-4693-b76d-09fd332a7082/nova-api-log/0.log" Jan 29 18:05:11 crc kubenswrapper[4886]: I0129 18:05:11.277125 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bb22403c-016a-48ea-954a-b7b14ea77d7f/nova-cell0-conductor-conductor/0.log" Jan 29 18:05:11 crc kubenswrapper[4886]: I0129 18:05:11.460730 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_cbffe358-e916-4693-b76d-09fd332a7082/nova-api-api/0.log" Jan 29 18:05:11 crc kubenswrapper[4886]: I0129 18:05:11.518795 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_08160d2e-8072-4d08-9dd2-4b5f256b6d9d/nova-cell1-conductor-conductor/0.log" Jan 29 18:05:11 crc kubenswrapper[4886]: I0129 18:05:11.614741 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:05:11 crc kubenswrapper[4886]: E0129 18:05:11.615076 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:05:11 crc kubenswrapper[4886]: I0129 18:05:11.862491 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c2249ae5-133d-4750-9d7a-529dc8c9b39a/nova-cell1-novncproxy-novncproxy/0.log" Jan 29 18:05:11 crc kubenswrapper[4886]: I0129 18:05:11.935308 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9a568175-84cc-425a-9adf-5013a7fb5171/nova-metadata-log/0.log" Jan 29 18:05:12 crc kubenswrapper[4886]: I0129 18:05:12.232512 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_fc4c563c-21d3-41cf-aabf-dd4429d59b62/nova-scheduler-scheduler/0.log" Jan 29 18:05:12 crc kubenswrapper[4886]: I0129 18:05:12.349671 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_954d7d1e-fd92-4c83-87d8-87a1f866dbbe/mysql-bootstrap/0.log" Jan 29 18:05:12 crc kubenswrapper[4886]: I0129 18:05:12.696982 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_954d7d1e-fd92-4c83-87d8-87a1f866dbbe/mysql-bootstrap/0.log" Jan 29 18:05:12 crc kubenswrapper[4886]: I0129 18:05:12.744872 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_954d7d1e-fd92-4c83-87d8-87a1f866dbbe/galera/0.log" Jan 29 18:05:12 crc kubenswrapper[4886]: I0129 18:05:12.937272 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_98bed306-aa68-4e53-affc-e04497079ccb/mysql-bootstrap/0.log" Jan 29 18:05:13 crc kubenswrapper[4886]: I0129 18:05:13.160220 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_98bed306-aa68-4e53-affc-e04497079ccb/mysql-bootstrap/0.log" Jan 29 18:05:13 crc kubenswrapper[4886]: I0129 18:05:13.160694 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_98bed306-aa68-4e53-affc-e04497079ccb/galera/0.log" Jan 29 18:05:13 crc kubenswrapper[4886]: I0129 18:05:13.378583 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_be43aab6-3888-4260-a85c-147e2ae0a36d/openstackclient/0.log" Jan 29 18:05:13 crc kubenswrapper[4886]: I0129 18:05:13.451243 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-b7d9p_544b4515-481c-47f1-acb6-ed332a3497d4/ovn-controller/0.log" Jan 29 18:05:13 crc kubenswrapper[4886]: I0129 18:05:13.559416 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9a568175-84cc-425a-9adf-5013a7fb5171/nova-metadata-metadata/0.log" Jan 29 18:05:13 crc kubenswrapper[4886]: I0129 18:05:13.734630 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6f8zt_ff160c34-86ad-4048-9c67-2071e6c38373/openstack-network-exporter/0.log" Jan 29 18:05:13 crc kubenswrapper[4886]: I0129 18:05:13.791375 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xhds2_03dc141f-69cc-4cb4-af0b-acf85642b86e/ovsdb-server-init/0.log" Jan 29 18:05:14 crc kubenswrapper[4886]: I0129 18:05:14.024348 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xhds2_03dc141f-69cc-4cb4-af0b-acf85642b86e/ovsdb-server-init/0.log" Jan 29 18:05:14 crc kubenswrapper[4886]: I0129 18:05:14.035950 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xhds2_03dc141f-69cc-4cb4-af0b-acf85642b86e/ovs-vswitchd/0.log" Jan 29 18:05:14 crc kubenswrapper[4886]: I0129 18:05:14.068699 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xhds2_03dc141f-69cc-4cb4-af0b-acf85642b86e/ovsdb-server/0.log" Jan 29 18:05:14 crc kubenswrapper[4886]: I0129 18:05:14.325739 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dc04c928-b93c-49a3-a653-f82b5e686da5/ovn-northd/0.log" Jan 29 18:05:14 crc kubenswrapper[4886]: I0129 18:05:14.337695 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_dc04c928-b93c-49a3-a653-f82b5e686da5/openstack-network-exporter/0.log" Jan 29 18:05:14 crc kubenswrapper[4886]: I0129 18:05:14.370411 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_39601bb5-f2bc-47a6-824a-609c207b963f/openstack-network-exporter/0.log" Jan 29 18:05:14 crc kubenswrapper[4886]: I0129 18:05:14.564070 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_39601bb5-f2bc-47a6-824a-609c207b963f/ovsdbserver-nb/0.log" Jan 29 18:05:14 crc kubenswrapper[4886]: I0129 18:05:14.642005 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7b015d0c-8672-450a-a079-965cc4ccd07f/openstack-network-exporter/0.log" Jan 29 18:05:14 crc kubenswrapper[4886]: I0129 18:05:14.667533 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7b015d0c-8672-450a-a079-965cc4ccd07f/ovsdbserver-sb/0.log" Jan 29 18:05:14 crc kubenswrapper[4886]: I0129 18:05:14.897265 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-795d8c76d8-x2zqv_7e13d48e-3469-4f76-8bae-ab1a21556f5a/placement-api/0.log" Jan 29 18:05:15 crc kubenswrapper[4886]: I0129 18:05:15.258960 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8b3a2d6b-4eb5-44a2-837b-cfbe63f07107/init-config-reloader/0.log" Jan 29 18:05:15 crc kubenswrapper[4886]: I0129 18:05:15.279836 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-795d8c76d8-x2zqv_7e13d48e-3469-4f76-8bae-ab1a21556f5a/placement-log/0.log" Jan 29 18:05:15 crc kubenswrapper[4886]: I0129 18:05:15.418154 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8b3a2d6b-4eb5-44a2-837b-cfbe63f07107/init-config-reloader/0.log" Jan 29 18:05:15 crc kubenswrapper[4886]: I0129 18:05:15.600357 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8b3a2d6b-4eb5-44a2-837b-cfbe63f07107/config-reloader/0.log" Jan 29 18:05:15 crc kubenswrapper[4886]: I0129 18:05:15.780447 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8b3a2d6b-4eb5-44a2-837b-cfbe63f07107/prometheus/0.log" Jan 29 18:05:15 crc kubenswrapper[4886]: I0129 18:05:15.837268 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8b3a2d6b-4eb5-44a2-837b-cfbe63f07107/thanos-sidecar/0.log" Jan 29 18:05:15 crc kubenswrapper[4886]: I0129 18:05:15.961954 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9d0db9ae-746b-419a-bc61-bf85645d2bff/setup-container/0.log" Jan 29 18:05:16 crc kubenswrapper[4886]: I0129 18:05:16.166125 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9d0db9ae-746b-419a-bc61-bf85645d2bff/setup-container/0.log" Jan 29 18:05:16 crc kubenswrapper[4886]: I0129 18:05:16.216072 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2b0be43b-8956-45aa-ad50-de9183b3fea3/setup-container/0.log" Jan 29 18:05:16 crc kubenswrapper[4886]: I0129 18:05:16.313744 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9d0db9ae-746b-419a-bc61-bf85645d2bff/rabbitmq/0.log" Jan 29 18:05:16 crc kubenswrapper[4886]: I0129 18:05:16.415422 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_2b0be43b-8956-45aa-ad50-de9183b3fea3/setup-container/0.log" Jan 29 18:05:16 crc kubenswrapper[4886]: I0129 18:05:16.531212 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10/setup-container/0.log" Jan 29 18:05:16 crc kubenswrapper[4886]: I0129 18:05:16.653660 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2b0be43b-8956-45aa-ad50-de9183b3fea3/rabbitmq/0.log" Jan 29 18:05:16 crc kubenswrapper[4886]: I0129 18:05:16.754393 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10/setup-container/0.log" Jan 29 18:05:16 crc kubenswrapper[4886]: I0129 18:05:16.795128 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_49ed84c4-2bd9-4fb8-88fe-5bd9fe537a10/rabbitmq/0.log" Jan 29 18:05:16 crc kubenswrapper[4886]: I0129 18:05:16.914654 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_842bfe4d-04ba-4143-9076-3033163c7b82/setup-container/0.log" Jan 29 18:05:17 crc kubenswrapper[4886]: I0129 18:05:17.095964 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_842bfe4d-04ba-4143-9076-3033163c7b82/setup-container/0.log" Jan 29 18:05:17 crc kubenswrapper[4886]: I0129 18:05:17.133701 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_842bfe4d-04ba-4143-9076-3033163c7b82/rabbitmq/0.log" Jan 29 18:05:17 crc kubenswrapper[4886]: I0129 18:05:17.347195 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f458794ff-v7p92_79c81ef9-65c7-4372-9a47-8ed93521eadf/proxy-httpd/0.log" Jan 29 18:05:17 crc kubenswrapper[4886]: I0129 18:05:17.414799 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f458794ff-v7p92_79c81ef9-65c7-4372-9a47-8ed93521eadf/proxy-server/0.log" Jan 29 18:05:17 crc kubenswrapper[4886]: I0129 18:05:17.424778 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-s7294_ebccb3a0-d421-4c30-9201-43e9106e4006/swift-ring-rebalance/0.log" Jan 29 18:05:17 crc kubenswrapper[4886]: I0129 18:05:17.684132 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/account-auditor/0.log" Jan 29 18:05:17 crc kubenswrapper[4886]: I0129 18:05:17.685474 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/account-reaper/0.log" Jan 29 18:05:17 crc kubenswrapper[4886]: I0129 18:05:17.809069 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/account-replicator/0.log" Jan 29 18:05:17 crc kubenswrapper[4886]: I0129 18:05:17.869052 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/account-server/0.log" Jan 29 18:05:17 crc kubenswrapper[4886]: I0129 18:05:17.962355 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/container-auditor/0.log" Jan 29 18:05:17 crc kubenswrapper[4886]: I0129 18:05:17.972476 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/container-replicator/0.log" Jan 
29 18:05:18 crc kubenswrapper[4886]: I0129 18:05:18.037960 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/container-server/0.log" Jan 29 18:05:18 crc kubenswrapper[4886]: I0129 18:05:18.157610 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/container-updater/0.log" Jan 29 18:05:18 crc kubenswrapper[4886]: I0129 18:05:18.207915 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/object-auditor/0.log" Jan 29 18:05:18 crc kubenswrapper[4886]: I0129 18:05:18.261682 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/object-expirer/0.log" Jan 29 18:05:18 crc kubenswrapper[4886]: I0129 18:05:18.286401 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/object-replicator/0.log" Jan 29 18:05:18 crc kubenswrapper[4886]: I0129 18:05:18.354705 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/object-server/0.log" Jan 29 18:05:18 crc kubenswrapper[4886]: I0129 18:05:18.480241 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/swift-recon-cron/0.log" Jan 29 18:05:18 crc kubenswrapper[4886]: I0129 18:05:18.480434 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/rsync/0.log" Jan 29 18:05:18 crc kubenswrapper[4886]: I0129 18:05:18.494762 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6e2f2c6c-bc32-4a32-ba2c-8954d277ce47/object-updater/0.log" Jan 29 18:05:23 crc kubenswrapper[4886]: I0129 18:05:23.616213 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:05:23 crc kubenswrapper[4886]: E0129 18:05:23.618111 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:05:24 crc kubenswrapper[4886]: I0129 18:05:24.076532 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_88c8ef15-a2b1-41df-8048-752b56d26653/memcached/0.log" Jan 29 18:05:37 crc kubenswrapper[4886]: I0129 18:05:37.614980 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:05:38 crc kubenswrapper[4886]: I0129 18:05:38.717362 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"a8607a4ceafc19dc29f39e1c49905b447674d1829f5c41ef929e075c395f9df6"} Jan 29 18:05:47 crc kubenswrapper[4886]: I0129 18:05:47.120536 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp_c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e/util/0.log" Jan 29 18:05:47 crc kubenswrapper[4886]: I0129 18:05:47.298067 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp_c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e/pull/0.log" Jan 29 18:05:47 crc kubenswrapper[4886]: I0129 18:05:47.305316 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp_c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e/util/0.log" Jan 29 18:05:47 crc kubenswrapper[4886]: I0129 18:05:47.400343 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp_c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e/pull/0.log" Jan 29 18:05:47 crc kubenswrapper[4886]: I0129 18:05:47.725425 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp_c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e/util/0.log" Jan 29 18:05:47 crc kubenswrapper[4886]: I0129 18:05:47.730409 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp_c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e/extract/0.log" Jan 29 18:05:47 crc kubenswrapper[4886]: I0129 18:05:47.735471 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_39139fddf92796f15e1bf79fe958390e5d16e6c9136394aea75c727c23pvldp_c5eb87e5-9a66-4bf3-8348-1dc03c7e0e8e/pull/0.log" Jan 29 18:05:48 crc kubenswrapper[4886]: I0129 18:05:48.027440 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-w6qc6_4e16e340-e213-492a-9c93-851df7b1bddb/manager/0.log" Jan 29 18:05:48 crc kubenswrapper[4886]: I0129 18:05:48.052233 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-2g2cz_3ffc5e8b-7f7a-4585-b43d-07e2589493c9/manager/0.log" Jan 29 18:05:48 crc kubenswrapper[4886]: I0129 18:05:48.150685 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-rhxnz_d01e417c-a1b0-445d-83eb-f3c21a492138/manager/0.log" Jan 29 18:05:48 crc kubenswrapper[4886]: I0129 18:05:48.413168 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-qf2xg_3c56c53e-a292-4e75-b069-c1d06ceeb6c5/manager/0.log" Jan 29 18:05:48 crc kubenswrapper[4886]: I0129 18:05:48.465024 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-pfw9c_02decfa9-69fb-46b5-8b30-30954e39d411/manager/0.log" Jan 29 18:05:48 crc kubenswrapper[4886]: I0129 18:05:48.527619 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-4mmm8_81b8c703-d895-41ce-8ca3-99fd6b6eecb6/manager/0.log" Jan 29 18:05:48 crc kubenswrapper[4886]: I0129 18:05:48.780500 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-77z62_10cac00e-0cd8-4d53-a4dd-3f6b5200e7e0/manager/0.log" Jan 29 18:05:48 crc kubenswrapper[4886]: I0129 18:05:48.884925 4886 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-t5n28_f2898e34-e423-4576-a765-3919510dcd85/manager/0.log" Jan 29 18:05:49 crc kubenswrapper[4886]: I0129 18:05:49.011083 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-kwr4n_67107e9f-cf09-4d35-af26-c77f4d76083a/manager/0.log" Jan 29 18:05:49 crc kubenswrapper[4886]: I0129 18:05:49.126317 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-zpgq2_70336809-8231-4ed9-a912-8b668aaa53bb/manager/0.log" Jan 29 18:05:49 crc kubenswrapper[4886]: I0129 18:05:49.329937 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-c4j5s_4c2d29a3-d017-4e76-9a82-02943a6b38bf/manager/0.log" Jan 29 18:05:49 crc kubenswrapper[4886]: I0129 18:05:49.470930 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-9zqmc_053a2790-370f-44bd-a2c0-603ffb22ed3c/manager/0.log" Jan 29 18:05:49 crc kubenswrapper[4886]: I0129 18:05:49.648843 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-dxcgn_c3cbde0f-6b5d-47cf-93e6-3d2e12051aba/manager/0.log" Jan 29 18:05:49 crc kubenswrapper[4886]: I0129 18:05:49.740416 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-8gq2g_7b52b050-b925-4562-8682-693917b7899c/manager/0.log" Jan 29 18:05:49 crc kubenswrapper[4886]: I0129 18:05:49.852489 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4dqmkhh_c2b6285c-ada4-43f6-8716-53b2afa13723/manager/0.log" Jan 29 18:05:50 crc kubenswrapper[4886]: I0129 18:05:50.090345 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-86bf76f8cb-r9sbf_d4b791b8-523f-4cf0-9ec7-9283c2fd4dde/operator/0.log" Jan 29 18:05:50 crc kubenswrapper[4886]: I0129 18:05:50.333885 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ddcl7_9b2b35ba-9f49-4dd6-816d-6acc4e54e514/registry-server/0.log" Jan 29 18:05:50 crc kubenswrapper[4886]: I0129 18:05:50.573688 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-xnccq_14d9257b-94ae-4b29-b45a-403e034535d3/manager/0.log" Jan 29 18:05:50 crc kubenswrapper[4886]: I0129 18:05:50.971369 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-xt9wq_53042ed9-d676-4bb4-bf7b-9b3520aafd12/manager/0.log" Jan 29 18:05:51 crc kubenswrapper[4886]: I0129 18:05:51.039750 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-546c7b8b6d-hngs4_037bf2ff-dd50-4d62-a525-5304c088cbc0/manager/0.log" Jan 29 18:05:51 crc kubenswrapper[4886]: I0129 18:05:51.126605 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-ffdr9_165231a4-c627-484b-9aab-b4ce3feafe7e/operator/0.log" Jan 29 18:05:51 crc kubenswrapper[4886]: I0129 18:05:51.200132 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-cmfj2_608c459b-5b47-478a-9e3a-d83d935ae7c7/manager/0.log" Jan 29 18:05:51 crc kubenswrapper[4886]: I0129 18:05:51.611480 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-hf95f_cbfeb105-c5ee-408e-aac9-e4128e58f0e3/manager/0.log" Jan 29 18:05:51 crc kubenswrapper[4886]: I0129 18:05:51.682022 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-xnrxl_6a145dac-4d02-493c-9bd8-2f9652fcb1d1/manager/0.log" Jan 29 18:05:51 crc kubenswrapper[4886]: I0129 18:05:51.798074 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-75495fd598-2hpj4_7db85474-4c59-4db6-ab4a-51092ebd5c62/manager/0.log" Jan 29 18:06:14 crc kubenswrapper[4886]: I0129 18:06:14.059351 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-l5v6d_009f91e7-865b-400a-a879-4985c84b321c/control-plane-machine-set-operator/0.log" Jan 29 18:06:14 crc kubenswrapper[4886]: I0129 18:06:14.200401 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fgmg6_3510e180-be29-469c-bfa0-b06702f80c93/kube-rbac-proxy/0.log" Jan 29 18:06:14 crc kubenswrapper[4886]: I0129 18:06:14.262174 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fgmg6_3510e180-be29-469c-bfa0-b06702f80c93/machine-api-operator/0.log" Jan 29 18:06:29 crc kubenswrapper[4886]: I0129 18:06:29.651258 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-n8tt2_0eee9f11-c5ff-490b-a5ea-7a62ef8f0a0a/cert-manager-controller/0.log" Jan 29 18:06:29 crc kubenswrapper[4886]: I0129 18:06:29.863222 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-bqffj_f883321e-6f99-4c0d-89ea-377fec9d166c/cert-manager-cainjector/0.log" Jan 29 18:06:30 crc kubenswrapper[4886]: I0129 18:06:30.014213 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-sd87l_a80a9fce-17df-45c6-b123-f3060469c1c9/cert-manager-webhook/0.log" Jan 29 18:06:45 crc kubenswrapper[4886]: I0129 18:06:45.266950 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-d4tp4_2814fca3-5ea5-4b77-aad5-0308881c88bb/nmstate-console-plugin/0.log" Jan 29 18:06:45 crc kubenswrapper[4886]: I0129 18:06:45.454170 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-9lh4n_848b9df5-c882-4017-b1ad-6ac496646a76/nmstate-handler/0.log" Jan 29 18:06:45 crc kubenswrapper[4886]: I0129 18:06:45.496737 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-ntx9m_515c481a-e563-41c3-b5ff-d5957faf5217/kube-rbac-proxy/0.log" Jan 29 18:06:45 crc kubenswrapper[4886]: I0129 18:06:45.577146 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-ntx9m_515c481a-e563-41c3-b5ff-d5957faf5217/nmstate-metrics/0.log" Jan 29 18:06:45 crc kubenswrapper[4886]: I0129 18:06:45.659948 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-xn5zh_64313301-3779-4923-949f-b8de5c30b5bb/nmstate-operator/0.log" Jan 29 18:06:45 crc kubenswrapper[4886]: I0129 18:06:45.786664 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-mv5wp_c42903b0-c0d4-4c39-bed3-3c9d083e753d/nmstate-webhook/0.log" Jan 29 18:07:01 crc kubenswrapper[4886]: I0129 18:07:01.116055 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5b44bcdc44-bgqfw_994fe9e1-7adf-4aab-bc9e-d51fd52286a9/kube-rbac-proxy/0.log" Jan 29 18:07:01 crc kubenswrapper[4886]: I0129 18:07:01.136502 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5b44bcdc44-bgqfw_994fe9e1-7adf-4aab-bc9e-d51fd52286a9/manager/0.log" Jan 29 18:07:15 crc kubenswrapper[4886]: I0129 18:07:15.273393 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-72k5z_1151b336-be43-4e43-959d-463c956e9bc4/prometheus-operator/0.log" Jan 29 18:07:15 crc kubenswrapper[4886]: I0129 18:07:15.475359 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9_e2e7310d-6390-4a0d-b0bd-f8467c80517c/prometheus-operator-admission-webhook/0.log" Jan 29 18:07:15 crc kubenswrapper[4886]: I0129 18:07:15.526640 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5_e1472730-ce1e-4333-a6c6-930196b9d257/prometheus-operator-admission-webhook/0.log" Jan 29 18:07:15 crc kubenswrapper[4886]: I0129 18:07:15.665264 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-w5qml_17549a68-0567-40f8-9dda-37cd61f71b94/operator/0.log" Jan 29 18:07:15 crc kubenswrapper[4886]: I0129 18:07:15.708054 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-ld46c_ee1da890-a690-46b4-95aa-3f282b3cdc30/observability-ui-dashboards/0.log" Jan 29 18:07:15 crc kubenswrapper[4886]: I0129 18:07:15.839557 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dtcpm_d2a26d31-689d-4052-9df2-1654feb68c2d/perses-operator/0.log" Jan 29 18:07:31 crc kubenswrapper[4886]: I0129 18:07:31.923181 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-79cf69ddc8-hgdlt_7f5851a1-d10c-445d-bffc-12a6acc01ead/cluster-logging-operator/0.log" Jan 29 18:07:32 crc kubenswrapper[4886]: I0129 18:07:32.167737 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-qnmmn_bd8dc819-215b-44f5-b758-9bac32be60f5/collector/0.log" Jan 29 18:07:32 crc kubenswrapper[4886]: I0129 18:07:32.295448 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_37c313cd-31f0-4fb3-9241-a3a59b1f55a6/loki-compactor/0.log" Jan 29 18:07:32 crc kubenswrapper[4886]: I0129 18:07:32.369258 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-5f678c8dd6-2jzzb_befd63fe-2ae3-4bb3-86fd-ac5486d7fbd1/loki-distributor/0.log" Jan 29 18:07:32 crc kubenswrapper[4886]: I0129 18:07:32.519422 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-gateway-8587c9555d-cszl5_c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b/gateway/0.log" Jan 29 18:07:32 crc kubenswrapper[4886]: I0129 18:07:32.528482 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-8587c9555d-cszl5_c39a9c6b-a3a0-4337-9c29-5fa3c161ef0b/opa/0.log" Jan 29 18:07:32 crc kubenswrapper[4886]: I0129 18:07:32.689002 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-8587c9555d-m4k69_046307bd-2e5e-4d92-b934-57ed8882d1bc/gateway/0.log" Jan 29 18:07:32 crc kubenswrapper[4886]: I0129 18:07:32.783226 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-8587c9555d-m4k69_046307bd-2e5e-4d92-b934-57ed8882d1bc/opa/0.log" Jan 29 18:07:32 crc kubenswrapper[4886]: I0129 18:07:32.809886 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_6059a5a7-5b65-481d-9b0f-f40d863e8310/loki-index-gateway/0.log" Jan 29 18:07:33 crc kubenswrapper[4886]: I0129 18:07:33.046771 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_0dd1a523-96c1-4311-9452-92e6da8a7e9b/loki-ingester/0.log" Jan 29 18:07:33 crc kubenswrapper[4886]: I0129 18:07:33.074394 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-76788598db-85zgx_fb80c257-3e6a-45c8-bb6f-6fb2676ef296/loki-querier/0.log" Jan 29 18:07:33 crc kubenswrapper[4886]: I0129 18:07:33.278647 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-69d9546745-9q2lr_fa3af54b-5759-4b53-a998-720bd2ff4608/loki-query-frontend/0.log" Jan 29 18:07:49 crc kubenswrapper[4886]: I0129 18:07:49.037779 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-tlnpb_946b39e6-3f42-4aff-a197-f29de26c175a/kube-rbac-proxy/0.log" Jan 29 18:07:49 crc kubenswrapper[4886]: I0129 18:07:49.184305 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-tlnpb_946b39e6-3f42-4aff-a197-f29de26c175a/controller/0.log" Jan 29 18:07:49 crc kubenswrapper[4886]: I0129 18:07:49.241739 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-frr-files/0.log" Jan 29 18:07:49 crc kubenswrapper[4886]: I0129 18:07:49.504128 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-metrics/0.log" Jan 29 18:07:49 crc kubenswrapper[4886]: I0129 18:07:49.506224 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-reloader/0.log" Jan 29 18:07:49 crc kubenswrapper[4886]: I0129 18:07:49.536817 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-frr-files/0.log" Jan 29 18:07:49 crc kubenswrapper[4886]: I0129 18:07:49.587171 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-reloader/0.log" Jan 29 18:07:49 crc kubenswrapper[4886]: I0129 18:07:49.748194 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-reloader/0.log" Jan 29 18:07:49 crc kubenswrapper[4886]: 
I0129 18:07:49.794223 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-frr-files/0.log" Jan 29 18:07:49 crc kubenswrapper[4886]: I0129 18:07:49.818898 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-metrics/0.log" Jan 29 18:07:49 crc kubenswrapper[4886]: I0129 18:07:49.828780 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-metrics/0.log" Jan 29 18:07:50 crc kubenswrapper[4886]: I0129 18:07:50.015875 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-reloader/0.log" Jan 29 18:07:50 crc kubenswrapper[4886]: I0129 18:07:50.022777 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-metrics/0.log" Jan 29 18:07:50 crc kubenswrapper[4886]: I0129 18:07:50.030534 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/cp-frr-files/0.log" Jan 29 18:07:50 crc kubenswrapper[4886]: I0129 18:07:50.038646 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/controller/0.log" Jan 29 18:07:50 crc kubenswrapper[4886]: I0129 18:07:50.237664 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/kube-rbac-proxy/0.log" Jan 29 18:07:50 crc kubenswrapper[4886]: I0129 18:07:50.239142 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/frr-metrics/0.log" Jan 29 18:07:50 crc kubenswrapper[4886]: I0129 18:07:50.279447 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/kube-rbac-proxy-frr/0.log" Jan 29 18:07:50 crc kubenswrapper[4886]: I0129 18:07:50.478380 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/reloader/0.log" Jan 29 18:07:50 crc kubenswrapper[4886]: I0129 18:07:50.592017 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-x455w_cf3feb5c-d348-4c0a-95c7-46f18db4687c/frr-k8s-webhook-server/0.log" Jan 29 18:07:50 crc kubenswrapper[4886]: I0129 18:07:50.995018 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-96d4668dd-sb2zt_a88b1900-1763-4d6c-9b3a-62598ab57eda/webhook-server/0.log" Jan 29 18:07:51 crc kubenswrapper[4886]: I0129 18:07:51.030076 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-77cfddbbb9-wbb7k_dc960811-7f19-4248-8d44-e3ffcb98d650/manager/0.log" Jan 29 18:07:51 crc kubenswrapper[4886]: I0129 18:07:51.220669 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bmwgt_5fe12a1b-277f-429e-a6b8-a874ec6e4918/kube-rbac-proxy/0.log" Jan 29 18:07:51 crc kubenswrapper[4886]: I0129 18:07:51.799455 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bmwgt_5fe12a1b-277f-429e-a6b8-a874ec6e4918/speaker/0.log" Jan 29 18:07:52 crc kubenswrapper[4886]: I0129 18:07:52.110703 4886 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4pt6_daa4e7b8-3078-4fd1-bb04-5185fa474080/frr/0.log" Jan 29 18:07:59 crc kubenswrapper[4886]: I0129 18:07:59.660877 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 18:07:59 crc kubenswrapper[4886]: I0129 18:07:59.661368 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 18:08:05 crc kubenswrapper[4886]: I0129 18:08:05.904367 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5_aa613edd-15e0-466f-8739-ab30f6d61801/util/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.105496 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5_aa613edd-15e0-466f-8739-ab30f6d61801/util/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.143991 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5_aa613edd-15e0-466f-8739-ab30f6d61801/pull/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.147540 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5_aa613edd-15e0-466f-8739-ab30f6d61801/pull/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.353034 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5_aa613edd-15e0-466f-8739-ab30f6d61801/util/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.391244 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5_aa613edd-15e0-466f-8739-ab30f6d61801/extract/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.414603 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2fbd5_aa613edd-15e0-466f-8739-ab30f6d61801/pull/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.566130 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn_1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7/util/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.701120 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn_1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7/util/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.718901 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn_1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7/pull/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 
18:08:06.753970 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn_1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7/pull/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.932323 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn_1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7/util/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.933280 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn_1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7/pull/0.log" Jan 29 18:08:06 crc kubenswrapper[4886]: I0129 18:08:06.940615 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jqprn_1a97d794-d0ac-4ad5-ae34-d81a8bf7d5e7/extract/0.log" Jan 29 18:08:07 crc kubenswrapper[4886]: I0129 18:08:07.116910 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ws2lm_d8ab6536-f9ab-4191-9c15-f3fe0453e7d0/extract-utilities/0.log" Jan 29 18:08:07 crc kubenswrapper[4886]: I0129 18:08:07.309683 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ws2lm_d8ab6536-f9ab-4191-9c15-f3fe0453e7d0/extract-utilities/0.log" Jan 29 18:08:07 crc kubenswrapper[4886]: I0129 18:08:07.309913 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ws2lm_d8ab6536-f9ab-4191-9c15-f3fe0453e7d0/extract-content/0.log" Jan 29 18:08:07 crc kubenswrapper[4886]: I0129 18:08:07.353354 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ws2lm_d8ab6536-f9ab-4191-9c15-f3fe0453e7d0/extract-content/0.log" Jan 29 18:08:07 crc kubenswrapper[4886]: I0129 18:08:07.496395 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ws2lm_d8ab6536-f9ab-4191-9c15-f3fe0453e7d0/extract-content/0.log" Jan 29 18:08:07 crc kubenswrapper[4886]: I0129 18:08:07.533902 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ws2lm_d8ab6536-f9ab-4191-9c15-f3fe0453e7d0/extract-utilities/0.log" Jan 29 18:08:07 crc kubenswrapper[4886]: I0129 18:08:07.709255 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vnttp_fbfc768f-4803-4f4e-9019-2aacda68bc47/extract-utilities/0.log" Jan 29 18:08:08 crc kubenswrapper[4886]: I0129 18:08:08.032490 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vnttp_fbfc768f-4803-4f4e-9019-2aacda68bc47/extract-content/0.log" Jan 29 18:08:08 crc kubenswrapper[4886]: I0129 18:08:08.059040 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vnttp_fbfc768f-4803-4f4e-9019-2aacda68bc47/extract-utilities/0.log" Jan 29 18:08:08 crc kubenswrapper[4886]: I0129 18:08:08.082759 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vnttp_fbfc768f-4803-4f4e-9019-2aacda68bc47/extract-content/0.log" Jan 29 18:08:08 crc kubenswrapper[4886]: I0129 18:08:08.233222 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-ws2lm_d8ab6536-f9ab-4191-9c15-f3fe0453e7d0/registry-server/0.log" Jan 29 18:08:08 crc kubenswrapper[4886]: I0129 18:08:08.275267 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vnttp_fbfc768f-4803-4f4e-9019-2aacda68bc47/extract-content/0.log" Jan 29 18:08:08 crc kubenswrapper[4886]: I0129 18:08:08.311186 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vnttp_fbfc768f-4803-4f4e-9019-2aacda68bc47/extract-utilities/0.log" Jan 29 18:08:08 crc kubenswrapper[4886]: I0129 18:08:08.462087 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-m8snn_9cb13d4a-3940-45ef-9135-ff94c6a75b0c/marketplace-operator/0.log" Jan 29 18:08:08 crc kubenswrapper[4886]: I0129 18:08:08.654976 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52bfx_87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca/extract-utilities/0.log" Jan 29 18:08:08 crc kubenswrapper[4886]: I0129 18:08:08.941514 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52bfx_87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca/extract-content/0.log" Jan 29 18:08:08 crc kubenswrapper[4886]: I0129 18:08:08.978002 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52bfx_87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca/extract-utilities/0.log" Jan 29 18:08:08 crc kubenswrapper[4886]: I0129 18:08:08.992603 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vnttp_fbfc768f-4803-4f4e-9019-2aacda68bc47/registry-server/0.log" Jan 29 18:08:09 crc kubenswrapper[4886]: I0129 18:08:09.021748 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52bfx_87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca/extract-content/0.log" Jan 29 18:08:09 crc kubenswrapper[4886]: I0129 18:08:09.261578 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52bfx_87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca/extract-utilities/0.log" Jan 29 18:08:09 crc kubenswrapper[4886]: I0129 18:08:09.284477 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52bfx_87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca/extract-content/0.log" Jan 29 18:08:09 crc kubenswrapper[4886]: I0129 18:08:09.460599 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52bfx_87b65e80-b30f-4ac4-bb06-ec8eb04cd7ca/registry-server/0.log" Jan 29 18:08:09 crc kubenswrapper[4886]: I0129 18:08:09.474008 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6bdhs_80e49770-fa31-4780-a5ac-38a6bc1221a9/extract-utilities/0.log" Jan 29 18:08:09 crc kubenswrapper[4886]: I0129 18:08:09.672239 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6bdhs_80e49770-fa31-4780-a5ac-38a6bc1221a9/extract-utilities/0.log" Jan 29 18:08:09 crc kubenswrapper[4886]: I0129 18:08:09.712048 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6bdhs_80e49770-fa31-4780-a5ac-38a6bc1221a9/extract-content/0.log" Jan 29 18:08:09 crc kubenswrapper[4886]: I0129 18:08:09.732662 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-6bdhs_80e49770-fa31-4780-a5ac-38a6bc1221a9/extract-content/0.log" Jan 29 18:08:09 crc kubenswrapper[4886]: I0129 18:08:09.888223 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6bdhs_80e49770-fa31-4780-a5ac-38a6bc1221a9/extract-content/0.log" Jan 29 18:08:09 crc kubenswrapper[4886]: I0129 18:08:09.949181 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6bdhs_80e49770-fa31-4780-a5ac-38a6bc1221a9/extract-utilities/0.log" Jan 29 18:08:10 crc kubenswrapper[4886]: I0129 18:08:10.511014 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6bdhs_80e49770-fa31-4780-a5ac-38a6bc1221a9/registry-server/0.log" Jan 29 18:08:25 crc kubenswrapper[4886]: I0129 18:08:25.674222 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-78f4cbbdd9-hrhb5_e1472730-ce1e-4333-a6c6-930196b9d257/prometheus-operator-admission-webhook/0.log" Jan 29 18:08:25 crc kubenswrapper[4886]: I0129 18:08:25.697564 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-72k5z_1151b336-be43-4e43-959d-463c956e9bc4/prometheus-operator/0.log" Jan 29 18:08:25 crc kubenswrapper[4886]: I0129 18:08:25.717232 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-78f4cbbdd9-75xq9_e2e7310d-6390-4a0d-b0bd-f8467c80517c/prometheus-operator-admission-webhook/0.log" Jan 29 18:08:26 crc kubenswrapper[4886]: I0129 18:08:26.065439 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-ld46c_ee1da890-a690-46b4-95aa-3f282b3cdc30/observability-ui-dashboards/0.log" Jan 29 18:08:26 crc kubenswrapper[4886]: I0129 18:08:26.113169 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-w5qml_17549a68-0567-40f8-9dda-37cd61f71b94/operator/0.log" Jan 29 18:08:26 crc kubenswrapper[4886]: I0129 18:08:26.187501 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dtcpm_d2a26d31-689d-4052-9df2-1654feb68c2d/perses-operator/0.log" Jan 29 18:08:29 crc kubenswrapper[4886]: I0129 18:08:29.660785 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 18:08:29 crc kubenswrapper[4886]: I0129 18:08:29.661471 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 18:08:41 crc kubenswrapper[4886]: I0129 18:08:41.856318 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5b44bcdc44-bgqfw_994fe9e1-7adf-4aab-bc9e-d51fd52286a9/manager/0.log" Jan 29 18:08:41 crc kubenswrapper[4886]: I0129 18:08:41.905761 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5b44bcdc44-bgqfw_994fe9e1-7adf-4aab-bc9e-d51fd52286a9/kube-rbac-proxy/0.log" Jan 29 18:08:59 crc kubenswrapper[4886]: I0129 18:08:59.660546 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 18:08:59 crc kubenswrapper[4886]: I0129 18:08:59.660977 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 18:08:59 crc kubenswrapper[4886]: I0129 18:08:59.661025 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 18:08:59 crc kubenswrapper[4886]: I0129 18:08:59.661549 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a8607a4ceafc19dc29f39e1c49905b447674d1829f5c41ef929e075c395f9df6"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 18:08:59 crc kubenswrapper[4886]: I0129 18:08:59.661592 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://a8607a4ceafc19dc29f39e1c49905b447674d1829f5c41ef929e075c395f9df6" gracePeriod=600 Jan 29 18:08:59 crc kubenswrapper[4886]: I0129 18:08:59.858207 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="a8607a4ceafc19dc29f39e1c49905b447674d1829f5c41ef929e075c395f9df6" exitCode=0 Jan 29 18:08:59 crc kubenswrapper[4886]: I0129 18:08:59.858303 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"a8607a4ceafc19dc29f39e1c49905b447674d1829f5c41ef929e075c395f9df6"} Jan 29 18:08:59 crc kubenswrapper[4886]: I0129 18:08:59.858616 4886 scope.go:117] "RemoveContainer" containerID="d68f7ec6ceb9d5c0ab55fbdd924d4866f80618e90c6f48af98c7c175db4cf62a" Jan 29 18:09:00 crc kubenswrapper[4886]: I0129 18:09:00.887951 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerStarted","Data":"b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b"} Jan 29 18:09:56 crc kubenswrapper[4886]: I0129 18:09:56.916208 4886 scope.go:117] "RemoveContainer" containerID="9151f75a515b793b76d61e304966261ea994214c86da5ff66a0d5a788f6197a1" Jan 29 18:10:19 crc kubenswrapper[4886]: I0129 18:10:19.996795 4886 generic.go:334] "Generic (PLEG): container finished" podID="fd01fd0d-8339-41ba-be01-6c3b723b2ec9" containerID="2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71" exitCode=0 Jan 29 18:10:19 crc kubenswrapper[4886]: I0129 18:10:19.996874 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsq2b/must-gather-jss9f" event={"ID":"fd01fd0d-8339-41ba-be01-6c3b723b2ec9","Type":"ContainerDied","Data":"2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71"} Jan 29 18:10:19 crc kubenswrapper[4886]: I0129 18:10:19.998759 4886 scope.go:117] "RemoveContainer" containerID="2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71" Jan 29 18:10:20 crc kubenswrapper[4886]: I0129 18:10:20.416736 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lsq2b_must-gather-jss9f_fd01fd0d-8339-41ba-be01-6c3b723b2ec9/gather/0.log" Jan 29 18:10:28 crc kubenswrapper[4886]: I0129 18:10:28.077362 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lsq2b/must-gather-jss9f"] Jan 29 18:10:28 crc kubenswrapper[4886]: I0129 18:10:28.078174 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lsq2b/must-gather-jss9f" podUID="fd01fd0d-8339-41ba-be01-6c3b723b2ec9" containerName="copy" containerID="cri-o://941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480" gracePeriod=2 Jan 29 18:10:28 crc kubenswrapper[4886]: I0129 18:10:28.085307 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lsq2b/must-gather-jss9f"] Jan 29 18:10:28 crc kubenswrapper[4886]: I0129 18:10:28.616491 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lsq2b_must-gather-jss9f_fd01fd0d-8339-41ba-be01-6c3b723b2ec9/copy/0.log" Jan 29 18:10:28 crc kubenswrapper[4886]: I0129 18:10:28.630226 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lsq2b/must-gather-jss9f" Jan 29 18:10:28 crc kubenswrapper[4886]: I0129 18:10:28.755964 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-must-gather-output\") pod \"fd01fd0d-8339-41ba-be01-6c3b723b2ec9\" (UID: \"fd01fd0d-8339-41ba-be01-6c3b723b2ec9\") " Jan 29 18:10:28 crc kubenswrapper[4886]: I0129 18:10:28.756032 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv85m\" (UniqueName: \"kubernetes.io/projected/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-kube-api-access-lv85m\") pod \"fd01fd0d-8339-41ba-be01-6c3b723b2ec9\" (UID: \"fd01fd0d-8339-41ba-be01-6c3b723b2ec9\") " Jan 29 18:10:28 crc kubenswrapper[4886]: I0129 18:10:28.764574 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-kube-api-access-lv85m" (OuterVolumeSpecName: "kube-api-access-lv85m") pod "fd01fd0d-8339-41ba-be01-6c3b723b2ec9" (UID: "fd01fd0d-8339-41ba-be01-6c3b723b2ec9"). InnerVolumeSpecName "kube-api-access-lv85m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:10:28 crc kubenswrapper[4886]: I0129 18:10:28.859157 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv85m\" (UniqueName: \"kubernetes.io/projected/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-kube-api-access-lv85m\") on node \"crc\" DevicePath \"\"" Jan 29 18:10:28 crc kubenswrapper[4886]: I0129 18:10:28.949842 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "fd01fd0d-8339-41ba-be01-6c3b723b2ec9" (UID: "fd01fd0d-8339-41ba-be01-6c3b723b2ec9"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:10:28 crc kubenswrapper[4886]: I0129 18:10:28.960364 4886 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd01fd0d-8339-41ba-be01-6c3b723b2ec9-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 29 18:10:29 crc kubenswrapper[4886]: I0129 18:10:29.096072 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lsq2b_must-gather-jss9f_fd01fd0d-8339-41ba-be01-6c3b723b2ec9/copy/0.log" Jan 29 18:10:29 crc kubenswrapper[4886]: I0129 18:10:29.097247 4886 generic.go:334] "Generic (PLEG): container finished" podID="fd01fd0d-8339-41ba-be01-6c3b723b2ec9" containerID="941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480" exitCode=143 Jan 29 18:10:29 crc kubenswrapper[4886]: I0129 18:10:29.097307 4886 scope.go:117] "RemoveContainer" containerID="941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480" Jan 29 18:10:29 crc kubenswrapper[4886]: I0129 18:10:29.097443 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lsq2b/must-gather-jss9f" Jan 29 18:10:29 crc kubenswrapper[4886]: I0129 18:10:29.123746 4886 scope.go:117] "RemoveContainer" containerID="2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71" Jan 29 18:10:29 crc kubenswrapper[4886]: I0129 18:10:29.189383 4886 scope.go:117] "RemoveContainer" containerID="941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480" Jan 29 18:10:29 crc kubenswrapper[4886]: E0129 18:10:29.190082 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480\": container with ID starting with 941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480 not found: ID does not exist" containerID="941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480" Jan 29 18:10:29 crc kubenswrapper[4886]: I0129 18:10:29.190131 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480"} err="failed to get container status \"941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480\": rpc error: code = NotFound desc = could not find container \"941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480\": container with ID starting with 941c9f11cb71ba19e856bc997a9757714af5c5ee6eb22fb06be9c6d2f5939480 not found: ID does not exist" Jan 29 18:10:29 crc kubenswrapper[4886]: I0129 18:10:29.190158 4886 scope.go:117] "RemoveContainer" containerID="2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71" Jan 29 18:10:29 crc kubenswrapper[4886]: E0129 18:10:29.190604 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71\": container with ID starting with 2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71 not found: ID does not exist" containerID="2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71" Jan 29 18:10:29 crc kubenswrapper[4886]: I0129 18:10:29.190667 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71"} err="failed to get container status \"2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71\": rpc error: code = NotFound desc = could not find container \"2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71\": container with ID starting with 2738216c87f4889a48f2223f13ba05e092ed8aee10ab356bb6e1bc6a50ac2a71 not found: ID does not exist" Jan 29 18:10:30 crc kubenswrapper[4886]: I0129 18:10:30.652448 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd01fd0d-8339-41ba-be01-6c3b723b2ec9" path="/var/lib/kubelet/pods/fd01fd0d-8339-41ba-be01-6c3b723b2ec9/volumes" Jan 29 18:11:29 crc kubenswrapper[4886]: I0129 18:11:29.660620 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 18:11:29 crc kubenswrapper[4886]: I0129 18:11:29.661171 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" 
podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.244307 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xm9wv"] Jan 29 18:11:38 crc kubenswrapper[4886]: E0129 18:11:38.249562 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf73c735-d3aa-476b-9390-6a150d51a290" containerName="registry-server" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.249586 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf73c735-d3aa-476b-9390-6a150d51a290" containerName="registry-server" Jan 29 18:11:38 crc kubenswrapper[4886]: E0129 18:11:38.249612 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd01fd0d-8339-41ba-be01-6c3b723b2ec9" containerName="copy" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.249620 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd01fd0d-8339-41ba-be01-6c3b723b2ec9" containerName="copy" Jan 29 18:11:38 crc kubenswrapper[4886]: E0129 18:11:38.249642 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf73c735-d3aa-476b-9390-6a150d51a290" containerName="extract-utilities" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.249651 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf73c735-d3aa-476b-9390-6a150d51a290" containerName="extract-utilities" Jan 29 18:11:38 crc kubenswrapper[4886]: E0129 18:11:38.249669 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerName="registry-server" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.249678 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerName="registry-server" Jan 29 18:11:38 crc kubenswrapper[4886]: E0129 18:11:38.249696 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf73c735-d3aa-476b-9390-6a150d51a290" containerName="extract-content" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.249704 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf73c735-d3aa-476b-9390-6a150d51a290" containerName="extract-content" Jan 29 18:11:38 crc kubenswrapper[4886]: E0129 18:11:38.249715 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd01fd0d-8339-41ba-be01-6c3b723b2ec9" containerName="gather" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.249722 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd01fd0d-8339-41ba-be01-6c3b723b2ec9" containerName="gather" Jan 29 18:11:38 crc kubenswrapper[4886]: E0129 18:11:38.249738 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerName="extract-utilities" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.249747 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerName="extract-utilities" Jan 29 18:11:38 crc kubenswrapper[4886]: E0129 18:11:38.249771 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerName="extract-content" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.249780 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerName="extract-content" Jan 29 18:11:38 crc 
kubenswrapper[4886]: I0129 18:11:38.250665 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd01fd0d-8339-41ba-be01-6c3b723b2ec9" containerName="gather" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.250695 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace6b3f5-2f50-4320-87db-40229f5f2cfa" containerName="registry-server" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.250731 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd01fd0d-8339-41ba-be01-6c3b723b2ec9" containerName="copy" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.250745 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf73c735-d3aa-476b-9390-6a150d51a290" containerName="registry-server" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.254436 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.263721 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xm9wv"] Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.354156 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-utilities\") pod \"community-operators-xm9wv\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.354417 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-catalog-content\") pod \"community-operators-xm9wv\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.354463 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qctt9\" (UniqueName: \"kubernetes.io/projected/e1a40aab-9df6-46b7-ae77-30f27474304d-kube-api-access-qctt9\") pod \"community-operators-xm9wv\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.456784 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-catalog-content\") pod \"community-operators-xm9wv\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.456863 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qctt9\" (UniqueName: \"kubernetes.io/projected/e1a40aab-9df6-46b7-ae77-30f27474304d-kube-api-access-qctt9\") pod \"community-operators-xm9wv\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.456982 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-utilities\") pod \"community-operators-xm9wv\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " 
pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.457414 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-utilities\") pod \"community-operators-xm9wv\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.457560 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-catalog-content\") pod \"community-operators-xm9wv\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.493587 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qctt9\" (UniqueName: \"kubernetes.io/projected/e1a40aab-9df6-46b7-ae77-30f27474304d-kube-api-access-qctt9\") pod \"community-operators-xm9wv\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:11:38 crc kubenswrapper[4886]: I0129 18:11:38.591732 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:11:39 crc kubenswrapper[4886]: I0129 18:11:39.189876 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xm9wv"] Jan 29 18:11:40 crc kubenswrapper[4886]: I0129 18:11:40.054889 4886 generic.go:334] "Generic (PLEG): container finished" podID="e1a40aab-9df6-46b7-ae77-30f27474304d" containerID="73a76d9bf9407207bb16286ea217fd9e932d96ee9e61b5e551d230717409c7fd" exitCode=0 Jan 29 18:11:40 crc kubenswrapper[4886]: I0129 18:11:40.055006 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm9wv" event={"ID":"e1a40aab-9df6-46b7-ae77-30f27474304d","Type":"ContainerDied","Data":"73a76d9bf9407207bb16286ea217fd9e932d96ee9e61b5e551d230717409c7fd"} Jan 29 18:11:40 crc kubenswrapper[4886]: I0129 18:11:40.055278 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm9wv" event={"ID":"e1a40aab-9df6-46b7-ae77-30f27474304d","Type":"ContainerStarted","Data":"95030769d112ca044ae55f690f43571603a8addbf7e4c6fa08a67c0409685a38"} Jan 29 18:11:40 crc kubenswrapper[4886]: I0129 18:11:40.058557 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 18:11:40 crc kubenswrapper[4886]: E0129 18:11:40.222666 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 18:11:40 crc kubenswrapper[4886]: E0129 18:11:40.222913 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qctt9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xm9wv_openshift-marketplace(e1a40aab-9df6-46b7-ae77-30f27474304d): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 18:11:40 crc kubenswrapper[4886]: E0129 18:11:40.224227 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:11:41 crc kubenswrapper[4886]: E0129 18:11:41.070347 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:11:52 crc kubenswrapper[4886]: E0129 18:11:52.761776 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 18:11:52 crc kubenswrapper[4886]: E0129 18:11:52.762248 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qctt9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xm9wv_openshift-marketplace(e1a40aab-9df6-46b7-ae77-30f27474304d): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 18:11:52 crc kubenswrapper[4886]: E0129 18:11:52.763415 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:11:59 crc kubenswrapper[4886]: I0129 18:11:59.661594 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 18:11:59 crc kubenswrapper[4886]: I0129 18:11:59.662299 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 18:12:07 crc kubenswrapper[4886]: E0129 18:12:07.622743 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:12:18 crc kubenswrapper[4886]: E0129 18:12:18.769713 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" 
image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 18:12:18 crc kubenswrapper[4886]: E0129 18:12:18.770779 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qctt9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xm9wv_openshift-marketplace(e1a40aab-9df6-46b7-ae77-30f27474304d): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 18:12:18 crc kubenswrapper[4886]: E0129 18:12:18.772081 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:12:29 crc kubenswrapper[4886]: I0129 18:12:29.661088 4886 patch_prober.go:28] interesting pod/machine-config-daemon-gx4vp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 18:12:29 crc kubenswrapper[4886]: I0129 18:12:29.661726 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 18:12:29 crc kubenswrapper[4886]: I0129 18:12:29.661785 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" Jan 29 18:12:29 crc kubenswrapper[4886]: I0129 18:12:29.662732 4886 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b"} pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 18:12:29 crc kubenswrapper[4886]: I0129 18:12:29.662835 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerName="machine-config-daemon" containerID="cri-o://b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" gracePeriod=600 Jan 29 18:12:29 crc kubenswrapper[4886]: E0129 18:12:29.787838 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:12:30 crc kubenswrapper[4886]: I0129 18:12:30.704278 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" exitCode=0 Jan 29 18:12:30 crc kubenswrapper[4886]: I0129 18:12:30.704355 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" event={"ID":"5a5d8fc0-7aa5-431a-9add-9bdcc6d20091","Type":"ContainerDied","Data":"b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b"} Jan 29 18:12:30 crc kubenswrapper[4886]: I0129 18:12:30.704647 4886 scope.go:117] "RemoveContainer" containerID="a8607a4ceafc19dc29f39e1c49905b447674d1829f5c41ef929e075c395f9df6" Jan 29 18:12:30 crc kubenswrapper[4886]: I0129 18:12:30.705944 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:12:30 crc kubenswrapper[4886]: E0129 18:12:30.706654 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:12:32 crc kubenswrapper[4886]: E0129 18:12:32.618213 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:12:41 crc kubenswrapper[4886]: I0129 18:12:41.616303 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:12:41 crc kubenswrapper[4886]: E0129 18:12:41.617660 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:12:43 crc kubenswrapper[4886]: E0129 18:12:43.619106 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:12:55 crc kubenswrapper[4886]: E0129 18:12:55.620004 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:12:56 crc kubenswrapper[4886]: I0129 18:12:56.616629 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:12:56 crc kubenswrapper[4886]: E0129 18:12:56.617870 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:13:08 crc kubenswrapper[4886]: I0129 18:13:08.630575 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:13:08 crc kubenswrapper[4886]: E0129 18:13:08.631699 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:13:08 crc kubenswrapper[4886]: E0129 18:13:08.766448 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 18:13:08 crc kubenswrapper[4886]: E0129 18:13:08.766733 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qctt9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xm9wv_openshift-marketplace(e1a40aab-9df6-46b7-ae77-30f27474304d): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 18:13:08 crc kubenswrapper[4886]: E0129 18:13:08.768096 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:13:19 crc kubenswrapper[4886]: I0129 18:13:19.615667 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:13:19 crc kubenswrapper[4886]: E0129 18:13:19.616692 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:13:19 crc kubenswrapper[4886]: E0129 18:13:19.617791 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:13:34 crc kubenswrapper[4886]: I0129 18:13:34.615292 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:13:34 crc kubenswrapper[4886]: E0129 18:13:34.616642 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:13:34 crc kubenswrapper[4886]: E0129 18:13:34.618374 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:13:45 crc kubenswrapper[4886]: I0129 18:13:45.615977 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:13:45 crc kubenswrapper[4886]: E0129 18:13:45.619749 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:13:46 crc kubenswrapper[4886]: E0129 18:13:46.617227 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:13:57 crc kubenswrapper[4886]: I0129 18:13:57.615739 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:13:57 crc kubenswrapper[4886]: E0129 18:13:57.616539 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:14:01 crc kubenswrapper[4886]: E0129 18:14:01.618777 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:14:10 crc kubenswrapper[4886]: I0129 18:14:10.618717 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:14:10 crc kubenswrapper[4886]: E0129 18:14:10.619794 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:14:15 crc kubenswrapper[4886]: E0129 18:14:15.619765 4886 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.397484 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n562p"] Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.402419 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.412580 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n562p"] Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.532718 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gngjw\" (UniqueName: \"kubernetes.io/projected/cab12cac-196d-4567-b193-dbfe7e5dceac-kube-api-access-gngjw\") pod \"redhat-marketplace-n562p\" (UID: \"cab12cac-196d-4567-b193-dbfe7e5dceac\") " pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.533466 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-catalog-content\") pod \"redhat-marketplace-n562p\" (UID: \"cab12cac-196d-4567-b193-dbfe7e5dceac\") " pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.533621 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-utilities\") pod \"redhat-marketplace-n562p\" (UID: \"cab12cac-196d-4567-b193-dbfe7e5dceac\") " pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.649006 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gngjw\" (UniqueName: \"kubernetes.io/projected/cab12cac-196d-4567-b193-dbfe7e5dceac-kube-api-access-gngjw\") pod \"redhat-marketplace-n562p\" (UID: \"cab12cac-196d-4567-b193-dbfe7e5dceac\") " pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.649273 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-catalog-content\") pod \"redhat-marketplace-n562p\" (UID: \"cab12cac-196d-4567-b193-dbfe7e5dceac\") " pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.649296 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-utilities\") pod \"redhat-marketplace-n562p\" (UID: \"cab12cac-196d-4567-b193-dbfe7e5dceac\") " pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.651372 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-utilities\") pod \"redhat-marketplace-n562p\" (UID: 
\"cab12cac-196d-4567-b193-dbfe7e5dceac\") " pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.652018 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-catalog-content\") pod \"redhat-marketplace-n562p\" (UID: \"cab12cac-196d-4567-b193-dbfe7e5dceac\") " pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.674136 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gngjw\" (UniqueName: \"kubernetes.io/projected/cab12cac-196d-4567-b193-dbfe7e5dceac-kube-api-access-gngjw\") pod \"redhat-marketplace-n562p\" (UID: \"cab12cac-196d-4567-b193-dbfe7e5dceac\") " pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:22 crc kubenswrapper[4886]: I0129 18:14:22.741059 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:23 crc kubenswrapper[4886]: I0129 18:14:23.271769 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n562p"] Jan 29 18:14:23 crc kubenswrapper[4886]: I0129 18:14:23.615056 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:14:23 crc kubenswrapper[4886]: E0129 18:14:23.615420 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:14:24 crc kubenswrapper[4886]: I0129 18:14:24.248936 4886 generic.go:334] "Generic (PLEG): container finished" podID="cab12cac-196d-4567-b193-dbfe7e5dceac" containerID="293d46bccbff7998923fdc0bd1e4d2e70801401dcf85ea823ef008fcadc6dea0" exitCode=0 Jan 29 18:14:24 crc kubenswrapper[4886]: I0129 18:14:24.249025 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n562p" event={"ID":"cab12cac-196d-4567-b193-dbfe7e5dceac","Type":"ContainerDied","Data":"293d46bccbff7998923fdc0bd1e4d2e70801401dcf85ea823ef008fcadc6dea0"} Jan 29 18:14:24 crc kubenswrapper[4886]: I0129 18:14:24.249283 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n562p" event={"ID":"cab12cac-196d-4567-b193-dbfe7e5dceac","Type":"ContainerStarted","Data":"6bcaff8425100bf86063574e42241279b47ae8277b9085c7d68e2ea73533940c"} Jan 29 18:14:26 crc kubenswrapper[4886]: I0129 18:14:26.269019 4886 generic.go:334] "Generic (PLEG): container finished" podID="cab12cac-196d-4567-b193-dbfe7e5dceac" containerID="46e4ac4967e8890f9ea1b048f9f0023744ebfb614ed27aad57d597bc3d686c99" exitCode=0 Jan 29 18:14:26 crc kubenswrapper[4886]: I0129 18:14:26.269085 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n562p" event={"ID":"cab12cac-196d-4567-b193-dbfe7e5dceac","Type":"ContainerDied","Data":"46e4ac4967e8890f9ea1b048f9f0023744ebfb614ed27aad57d597bc3d686c99"} Jan 29 18:14:27 crc kubenswrapper[4886]: I0129 18:14:27.283645 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-n562p" event={"ID":"cab12cac-196d-4567-b193-dbfe7e5dceac","Type":"ContainerStarted","Data":"b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2"} Jan 29 18:14:27 crc kubenswrapper[4886]: I0129 18:14:27.301030 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n562p" podStartSLOduration=2.856990968 podStartE2EDuration="5.301011937s" podCreationTimestamp="2026-01-29 18:14:22 +0000 UTC" firstStartedPulling="2026-01-29 18:14:24.251237319 +0000 UTC m=+6747.159956611" lastFinishedPulling="2026-01-29 18:14:26.695258268 +0000 UTC m=+6749.603977580" observedRunningTime="2026-01-29 18:14:27.300026349 +0000 UTC m=+6750.208745631" watchObservedRunningTime="2026-01-29 18:14:27.301011937 +0000 UTC m=+6750.209731209" Jan 29 18:14:31 crc kubenswrapper[4886]: I0129 18:14:31.336459 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm9wv" event={"ID":"e1a40aab-9df6-46b7-ae77-30f27474304d","Type":"ContainerStarted","Data":"15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f"} Jan 29 18:14:32 crc kubenswrapper[4886]: I0129 18:14:32.354706 4886 generic.go:334] "Generic (PLEG): container finished" podID="e1a40aab-9df6-46b7-ae77-30f27474304d" containerID="15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f" exitCode=0 Jan 29 18:14:32 crc kubenswrapper[4886]: I0129 18:14:32.354771 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm9wv" event={"ID":"e1a40aab-9df6-46b7-ae77-30f27474304d","Type":"ContainerDied","Data":"15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f"} Jan 29 18:14:32 crc kubenswrapper[4886]: I0129 18:14:32.741834 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:32 crc kubenswrapper[4886]: I0129 18:14:32.741991 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:32 crc kubenswrapper[4886]: I0129 18:14:32.803773 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:33 crc kubenswrapper[4886]: I0129 18:14:33.366049 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm9wv" event={"ID":"e1a40aab-9df6-46b7-ae77-30f27474304d","Type":"ContainerStarted","Data":"66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3"} Jan 29 18:14:33 crc kubenswrapper[4886]: I0129 18:14:33.393174 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xm9wv" podStartSLOduration=2.668651144 podStartE2EDuration="2m55.393156634s" podCreationTimestamp="2026-01-29 18:11:38 +0000 UTC" firstStartedPulling="2026-01-29 18:11:40.057970736 +0000 UTC m=+6582.966690038" lastFinishedPulling="2026-01-29 18:14:32.782476256 +0000 UTC m=+6755.691195528" observedRunningTime="2026-01-29 18:14:33.386404753 +0000 UTC m=+6756.295124055" watchObservedRunningTime="2026-01-29 18:14:33.393156634 +0000 UTC m=+6756.301875906" Jan 29 18:14:33 crc kubenswrapper[4886]: I0129 18:14:33.421475 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:34 crc kubenswrapper[4886]: I0129 18:14:34.610390 4886 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n562p"] Jan 29 18:14:35 crc kubenswrapper[4886]: I0129 18:14:35.389642 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n562p" podUID="cab12cac-196d-4567-b193-dbfe7e5dceac" containerName="registry-server" containerID="cri-o://b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2" gracePeriod=2 Jan 29 18:14:35 crc kubenswrapper[4886]: I0129 18:14:35.980707 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.116947 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gngjw\" (UniqueName: \"kubernetes.io/projected/cab12cac-196d-4567-b193-dbfe7e5dceac-kube-api-access-gngjw\") pod \"cab12cac-196d-4567-b193-dbfe7e5dceac\" (UID: \"cab12cac-196d-4567-b193-dbfe7e5dceac\") " Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.117047 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-utilities\") pod \"cab12cac-196d-4567-b193-dbfe7e5dceac\" (UID: \"cab12cac-196d-4567-b193-dbfe7e5dceac\") " Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.117407 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-catalog-content\") pod \"cab12cac-196d-4567-b193-dbfe7e5dceac\" (UID: \"cab12cac-196d-4567-b193-dbfe7e5dceac\") " Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.118898 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-utilities" (OuterVolumeSpecName: "utilities") pod "cab12cac-196d-4567-b193-dbfe7e5dceac" (UID: "cab12cac-196d-4567-b193-dbfe7e5dceac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.122000 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cab12cac-196d-4567-b193-dbfe7e5dceac-kube-api-access-gngjw" (OuterVolumeSpecName: "kube-api-access-gngjw") pod "cab12cac-196d-4567-b193-dbfe7e5dceac" (UID: "cab12cac-196d-4567-b193-dbfe7e5dceac"). InnerVolumeSpecName "kube-api-access-gngjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.164500 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cab12cac-196d-4567-b193-dbfe7e5dceac" (UID: "cab12cac-196d-4567-b193-dbfe7e5dceac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.220955 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.221009 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gngjw\" (UniqueName: \"kubernetes.io/projected/cab12cac-196d-4567-b193-dbfe7e5dceac-kube-api-access-gngjw\") on node \"crc\" DevicePath \"\"" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.221034 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cab12cac-196d-4567-b193-dbfe7e5dceac-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.403502 4886 generic.go:334] "Generic (PLEG): container finished" podID="cab12cac-196d-4567-b193-dbfe7e5dceac" containerID="b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2" exitCode=0 Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.403573 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n562p" event={"ID":"cab12cac-196d-4567-b193-dbfe7e5dceac","Type":"ContainerDied","Data":"b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2"} Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.403610 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n562p" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.403652 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n562p" event={"ID":"cab12cac-196d-4567-b193-dbfe7e5dceac","Type":"ContainerDied","Data":"6bcaff8425100bf86063574e42241279b47ae8277b9085c7d68e2ea73533940c"} Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.403700 4886 scope.go:117] "RemoveContainer" containerID="b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.438249 4886 scope.go:117] "RemoveContainer" containerID="46e4ac4967e8890f9ea1b048f9f0023744ebfb614ed27aad57d597bc3d686c99" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.466254 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n562p"] Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.477139 4886 scope.go:117] "RemoveContainer" containerID="293d46bccbff7998923fdc0bd1e4d2e70801401dcf85ea823ef008fcadc6dea0" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.483426 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n562p"] Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.543420 4886 scope.go:117] "RemoveContainer" containerID="b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2" Jan 29 18:14:36 crc kubenswrapper[4886]: E0129 18:14:36.543898 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2\": container with ID starting with b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2 not found: ID does not exist" containerID="b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.543957 4886 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2"} err="failed to get container status \"b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2\": rpc error: code = NotFound desc = could not find container \"b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2\": container with ID starting with b6e921c2222268cb755607604fd8c0b3efbdfa3b43c1c202e82106f447a7a6b2 not found: ID does not exist" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.544004 4886 scope.go:117] "RemoveContainer" containerID="46e4ac4967e8890f9ea1b048f9f0023744ebfb614ed27aad57d597bc3d686c99" Jan 29 18:14:36 crc kubenswrapper[4886]: E0129 18:14:36.545254 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46e4ac4967e8890f9ea1b048f9f0023744ebfb614ed27aad57d597bc3d686c99\": container with ID starting with 46e4ac4967e8890f9ea1b048f9f0023744ebfb614ed27aad57d597bc3d686c99 not found: ID does not exist" containerID="46e4ac4967e8890f9ea1b048f9f0023744ebfb614ed27aad57d597bc3d686c99" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.545308 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e4ac4967e8890f9ea1b048f9f0023744ebfb614ed27aad57d597bc3d686c99"} err="failed to get container status \"46e4ac4967e8890f9ea1b048f9f0023744ebfb614ed27aad57d597bc3d686c99\": rpc error: code = NotFound desc = could not find container \"46e4ac4967e8890f9ea1b048f9f0023744ebfb614ed27aad57d597bc3d686c99\": container with ID starting with 46e4ac4967e8890f9ea1b048f9f0023744ebfb614ed27aad57d597bc3d686c99 not found: ID does not exist" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.545366 4886 scope.go:117] "RemoveContainer" containerID="293d46bccbff7998923fdc0bd1e4d2e70801401dcf85ea823ef008fcadc6dea0" Jan 29 18:14:36 crc kubenswrapper[4886]: E0129 18:14:36.546113 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"293d46bccbff7998923fdc0bd1e4d2e70801401dcf85ea823ef008fcadc6dea0\": container with ID starting with 293d46bccbff7998923fdc0bd1e4d2e70801401dcf85ea823ef008fcadc6dea0 not found: ID does not exist" containerID="293d46bccbff7998923fdc0bd1e4d2e70801401dcf85ea823ef008fcadc6dea0" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.546163 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"293d46bccbff7998923fdc0bd1e4d2e70801401dcf85ea823ef008fcadc6dea0"} err="failed to get container status \"293d46bccbff7998923fdc0bd1e4d2e70801401dcf85ea823ef008fcadc6dea0\": rpc error: code = NotFound desc = could not find container \"293d46bccbff7998923fdc0bd1e4d2e70801401dcf85ea823ef008fcadc6dea0\": container with ID starting with 293d46bccbff7998923fdc0bd1e4d2e70801401dcf85ea823ef008fcadc6dea0 not found: ID does not exist" Jan 29 18:14:36 crc kubenswrapper[4886]: I0129 18:14:36.638290 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cab12cac-196d-4567-b193-dbfe7e5dceac" path="/var/lib/kubelet/pods/cab12cac-196d-4567-b193-dbfe7e5dceac/volumes" Jan 29 18:14:37 crc kubenswrapper[4886]: I0129 18:14:37.616568 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:14:37 crc kubenswrapper[4886]: E0129 18:14:37.617166 4886 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:14:38 crc kubenswrapper[4886]: I0129 18:14:38.591921 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:14:38 crc kubenswrapper[4886]: I0129 18:14:38.591991 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:14:38 crc kubenswrapper[4886]: I0129 18:14:38.675917 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:14:39 crc kubenswrapper[4886]: I0129 18:14:39.553453 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:14:40 crc kubenswrapper[4886]: I0129 18:14:40.012647 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xm9wv"] Jan 29 18:14:41 crc kubenswrapper[4886]: I0129 18:14:41.487691 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xm9wv" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" containerName="registry-server" containerID="cri-o://66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3" gracePeriod=2 Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.111949 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.177303 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-catalog-content\") pod \"e1a40aab-9df6-46b7-ae77-30f27474304d\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.177639 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-utilities\") pod \"e1a40aab-9df6-46b7-ae77-30f27474304d\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.177885 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qctt9\" (UniqueName: \"kubernetes.io/projected/e1a40aab-9df6-46b7-ae77-30f27474304d-kube-api-access-qctt9\") pod \"e1a40aab-9df6-46b7-ae77-30f27474304d\" (UID: \"e1a40aab-9df6-46b7-ae77-30f27474304d\") " Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.179025 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-utilities" (OuterVolumeSpecName: "utilities") pod "e1a40aab-9df6-46b7-ae77-30f27474304d" (UID: "e1a40aab-9df6-46b7-ae77-30f27474304d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.188291 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a40aab-9df6-46b7-ae77-30f27474304d-kube-api-access-qctt9" (OuterVolumeSpecName: "kube-api-access-qctt9") pod "e1a40aab-9df6-46b7-ae77-30f27474304d" (UID: "e1a40aab-9df6-46b7-ae77-30f27474304d"). InnerVolumeSpecName "kube-api-access-qctt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.246689 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1a40aab-9df6-46b7-ae77-30f27474304d" (UID: "e1a40aab-9df6-46b7-ae77-30f27474304d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.281348 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.281383 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1a40aab-9df6-46b7-ae77-30f27474304d-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.281393 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qctt9\" (UniqueName: \"kubernetes.io/projected/e1a40aab-9df6-46b7-ae77-30f27474304d-kube-api-access-qctt9\") on node \"crc\" DevicePath \"\"" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.498681 4886 generic.go:334] "Generic (PLEG): container finished" podID="e1a40aab-9df6-46b7-ae77-30f27474304d" containerID="66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3" exitCode=0 Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.498722 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm9wv" event={"ID":"e1a40aab-9df6-46b7-ae77-30f27474304d","Type":"ContainerDied","Data":"66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3"} Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.498733 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xm9wv" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.498747 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm9wv" event={"ID":"e1a40aab-9df6-46b7-ae77-30f27474304d","Type":"ContainerDied","Data":"95030769d112ca044ae55f690f43571603a8addbf7e4c6fa08a67c0409685a38"} Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.498763 4886 scope.go:117] "RemoveContainer" containerID="66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.522529 4886 scope.go:117] "RemoveContainer" containerID="15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.542904 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xm9wv"] Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.551928 4886 scope.go:117] "RemoveContainer" containerID="73a76d9bf9407207bb16286ea217fd9e932d96ee9e61b5e551d230717409c7fd" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.554557 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xm9wv"] Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.602116 4886 scope.go:117] "RemoveContainer" containerID="66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3" Jan 29 18:14:42 crc kubenswrapper[4886]: E0129 18:14:42.602721 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3\": container with ID starting with 66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3 not found: ID does not exist" containerID="66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.602759 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3"} err="failed to get container status \"66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3\": rpc error: code = NotFound desc = could not find container \"66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3\": container with ID starting with 66a49c321d01b60ab3f2c9f17e95bf959950c32860ff7b7351a0c02780afd5b3 not found: ID does not exist" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.602790 4886 scope.go:117] "RemoveContainer" containerID="15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f" Jan 29 18:14:42 crc kubenswrapper[4886]: E0129 18:14:42.603251 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f\": container with ID starting with 15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f not found: ID does not exist" containerID="15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.603301 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f"} err="failed to get container status \"15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f\": rpc error: code = NotFound desc = could not find 
container \"15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f\": container with ID starting with 15dfacf562334e503bf98f0b143227cfe8c6890ca73ff759622e84fbb0b7592f not found: ID does not exist" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.603368 4886 scope.go:117] "RemoveContainer" containerID="73a76d9bf9407207bb16286ea217fd9e932d96ee9e61b5e551d230717409c7fd" Jan 29 18:14:42 crc kubenswrapper[4886]: E0129 18:14:42.603768 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73a76d9bf9407207bb16286ea217fd9e932d96ee9e61b5e551d230717409c7fd\": container with ID starting with 73a76d9bf9407207bb16286ea217fd9e932d96ee9e61b5e551d230717409c7fd not found: ID does not exist" containerID="73a76d9bf9407207bb16286ea217fd9e932d96ee9e61b5e551d230717409c7fd" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.603805 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a76d9bf9407207bb16286ea217fd9e932d96ee9e61b5e551d230717409c7fd"} err="failed to get container status \"73a76d9bf9407207bb16286ea217fd9e932d96ee9e61b5e551d230717409c7fd\": rpc error: code = NotFound desc = could not find container \"73a76d9bf9407207bb16286ea217fd9e932d96ee9e61b5e551d230717409c7fd\": container with ID starting with 73a76d9bf9407207bb16286ea217fd9e932d96ee9e61b5e551d230717409c7fd not found: ID does not exist" Jan 29 18:14:42 crc kubenswrapper[4886]: I0129 18:14:42.634464 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" path="/var/lib/kubelet/pods/e1a40aab-9df6-46b7-ae77-30f27474304d/volumes" Jan 29 18:14:50 crc kubenswrapper[4886]: I0129 18:14:50.615773 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:14:50 crc kubenswrapper[4886]: E0129 18:14:50.616995 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.157992 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59"] Jan 29 18:15:00 crc kubenswrapper[4886]: E0129 18:15:00.159241 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" containerName="extract-content" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.159258 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" containerName="extract-content" Jan 29 18:15:00 crc kubenswrapper[4886]: E0129 18:15:00.159287 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" containerName="registry-server" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.159296 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" containerName="registry-server" Jan 29 18:15:00 crc kubenswrapper[4886]: E0129 18:15:00.159313 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cab12cac-196d-4567-b193-dbfe7e5dceac" 
containerName="registry-server" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.159342 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab12cac-196d-4567-b193-dbfe7e5dceac" containerName="registry-server" Jan 29 18:15:00 crc kubenswrapper[4886]: E0129 18:15:00.159380 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cab12cac-196d-4567-b193-dbfe7e5dceac" containerName="extract-utilities" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.159390 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab12cac-196d-4567-b193-dbfe7e5dceac" containerName="extract-utilities" Jan 29 18:15:00 crc kubenswrapper[4886]: E0129 18:15:00.159420 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" containerName="extract-utilities" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.159429 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" containerName="extract-utilities" Jan 29 18:15:00 crc kubenswrapper[4886]: E0129 18:15:00.159444 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cab12cac-196d-4567-b193-dbfe7e5dceac" containerName="extract-content" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.159452 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab12cac-196d-4567-b193-dbfe7e5dceac" containerName="extract-content" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.159735 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a40aab-9df6-46b7-ae77-30f27474304d" containerName="registry-server" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.159777 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cab12cac-196d-4567-b193-dbfe7e5dceac" containerName="registry-server" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.160709 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.163862 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.164403 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.177635 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59"] Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.267157 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0dfd79b6-b491-45ff-9977-93e384a500a7-secret-volume\") pod \"collect-profiles-29495175-dxl59\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.267405 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0dfd79b6-b491-45ff-9977-93e384a500a7-config-volume\") pod \"collect-profiles-29495175-dxl59\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.267467 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58bxl\" (UniqueName: \"kubernetes.io/projected/0dfd79b6-b491-45ff-9977-93e384a500a7-kube-api-access-58bxl\") pod \"collect-profiles-29495175-dxl59\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.369793 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0dfd79b6-b491-45ff-9977-93e384a500a7-config-volume\") pod \"collect-profiles-29495175-dxl59\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.370192 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58bxl\" (UniqueName: \"kubernetes.io/projected/0dfd79b6-b491-45ff-9977-93e384a500a7-kube-api-access-58bxl\") pod \"collect-profiles-29495175-dxl59\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.370523 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0dfd79b6-b491-45ff-9977-93e384a500a7-secret-volume\") pod \"collect-profiles-29495175-dxl59\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.371322 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0dfd79b6-b491-45ff-9977-93e384a500a7-config-volume\") pod 
\"collect-profiles-29495175-dxl59\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.384298 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0dfd79b6-b491-45ff-9977-93e384a500a7-secret-volume\") pod \"collect-profiles-29495175-dxl59\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.395088 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58bxl\" (UniqueName: \"kubernetes.io/projected/0dfd79b6-b491-45ff-9977-93e384a500a7-kube-api-access-58bxl\") pod \"collect-profiles-29495175-dxl59\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:00 crc kubenswrapper[4886]: I0129 18:15:00.493925 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:01 crc kubenswrapper[4886]: I0129 18:15:00.998784 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59"] Jan 29 18:15:01 crc kubenswrapper[4886]: I0129 18:15:01.740243 4886 generic.go:334] "Generic (PLEG): container finished" podID="0dfd79b6-b491-45ff-9977-93e384a500a7" containerID="46ec392561fcd2b377c433df08c45dc60e9a2f469268a1ca308bb44ae0fd25e0" exitCode=0 Jan 29 18:15:01 crc kubenswrapper[4886]: I0129 18:15:01.740337 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" event={"ID":"0dfd79b6-b491-45ff-9977-93e384a500a7","Type":"ContainerDied","Data":"46ec392561fcd2b377c433df08c45dc60e9a2f469268a1ca308bb44ae0fd25e0"} Jan 29 18:15:01 crc kubenswrapper[4886]: I0129 18:15:01.740550 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" event={"ID":"0dfd79b6-b491-45ff-9977-93e384a500a7","Type":"ContainerStarted","Data":"24ba151c4962237189e61a655c72587607c861741938cbac67db9cbfa17ad60d"} Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.253348 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.447552 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0dfd79b6-b491-45ff-9977-93e384a500a7-config-volume\") pod \"0dfd79b6-b491-45ff-9977-93e384a500a7\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.447914 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0dfd79b6-b491-45ff-9977-93e384a500a7-secret-volume\") pod \"0dfd79b6-b491-45ff-9977-93e384a500a7\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.448143 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58bxl\" (UniqueName: \"kubernetes.io/projected/0dfd79b6-b491-45ff-9977-93e384a500a7-kube-api-access-58bxl\") pod \"0dfd79b6-b491-45ff-9977-93e384a500a7\" (UID: \"0dfd79b6-b491-45ff-9977-93e384a500a7\") " Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.448833 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dfd79b6-b491-45ff-9977-93e384a500a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "0dfd79b6-b491-45ff-9977-93e384a500a7" (UID: "0dfd79b6-b491-45ff-9977-93e384a500a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.452497 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0dfd79b6-b491-45ff-9977-93e384a500a7-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.454927 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dfd79b6-b491-45ff-9977-93e384a500a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0dfd79b6-b491-45ff-9977-93e384a500a7" (UID: "0dfd79b6-b491-45ff-9977-93e384a500a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.457458 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dfd79b6-b491-45ff-9977-93e384a500a7-kube-api-access-58bxl" (OuterVolumeSpecName: "kube-api-access-58bxl") pod "0dfd79b6-b491-45ff-9977-93e384a500a7" (UID: "0dfd79b6-b491-45ff-9977-93e384a500a7"). InnerVolumeSpecName "kube-api-access-58bxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.555098 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58bxl\" (UniqueName: \"kubernetes.io/projected/0dfd79b6-b491-45ff-9977-93e384a500a7-kube-api-access-58bxl\") on node \"crc\" DevicePath \"\"" Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.555377 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0dfd79b6-b491-45ff-9977-93e384a500a7-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.615106 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:15:03 crc kubenswrapper[4886]: E0129 18:15:03.615491 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.768479 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" event={"ID":"0dfd79b6-b491-45ff-9977-93e384a500a7","Type":"ContainerDied","Data":"24ba151c4962237189e61a655c72587607c861741938cbac67db9cbfa17ad60d"} Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.768525 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24ba151c4962237189e61a655c72587607c861741938cbac67db9cbfa17ad60d" Jan 29 18:15:03 crc kubenswrapper[4886]: I0129 18:15:03.768578 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495175-dxl59" Jan 29 18:15:04 crc kubenswrapper[4886]: I0129 18:15:04.324043 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55"] Jan 29 18:15:04 crc kubenswrapper[4886]: I0129 18:15:04.334698 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495130-cdv55"] Jan 29 18:15:04 crc kubenswrapper[4886]: I0129 18:15:04.626351 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281" path="/var/lib/kubelet/pods/d0fd5fcc-58d6-4d14-a68b-0c10e4dc5281/volumes" Jan 29 18:15:17 crc kubenswrapper[4886]: I0129 18:15:17.615467 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:15:17 crc kubenswrapper[4886]: E0129 18:15:17.616427 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.009016 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8x4lm"] Jan 29 18:15:21 crc kubenswrapper[4886]: E0129 18:15:21.010499 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dfd79b6-b491-45ff-9977-93e384a500a7" containerName="collect-profiles" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.010517 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dfd79b6-b491-45ff-9977-93e384a500a7" containerName="collect-profiles" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.010772 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dfd79b6-b491-45ff-9977-93e384a500a7" containerName="collect-profiles" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.013054 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.022149 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8x4lm"] Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.075355 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-utilities\") pod \"redhat-operators-8x4lm\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.075448 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-catalog-content\") pod \"redhat-operators-8x4lm\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.075519 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n5zm\" (UniqueName: \"kubernetes.io/projected/c23f34ba-7d01-4793-85de-7d1cecfcfd89-kube-api-access-9n5zm\") pod \"redhat-operators-8x4lm\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.178245 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-utilities\") pod \"redhat-operators-8x4lm\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.178318 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-catalog-content\") pod \"redhat-operators-8x4lm\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.178390 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n5zm\" (UniqueName: \"kubernetes.io/projected/c23f34ba-7d01-4793-85de-7d1cecfcfd89-kube-api-access-9n5zm\") pod \"redhat-operators-8x4lm\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.178817 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-utilities\") pod \"redhat-operators-8x4lm\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.178904 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-catalog-content\") pod \"redhat-operators-8x4lm\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.205656 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9n5zm\" (UniqueName: \"kubernetes.io/projected/c23f34ba-7d01-4793-85de-7d1cecfcfd89-kube-api-access-9n5zm\") pod \"redhat-operators-8x4lm\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.347484 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.868549 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8x4lm"] Jan 29 18:15:21 crc kubenswrapper[4886]: I0129 18:15:21.997573 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8x4lm" event={"ID":"c23f34ba-7d01-4793-85de-7d1cecfcfd89","Type":"ContainerStarted","Data":"3a7d98625554d3cac706804f00e5be38dcbb162664ca97b0c5839e0488d0d6e3"} Jan 29 18:15:23 crc kubenswrapper[4886]: I0129 18:15:23.015238 4886 generic.go:334] "Generic (PLEG): container finished" podID="c23f34ba-7d01-4793-85de-7d1cecfcfd89" containerID="159880fb7797e3eb5c8a3430fee5383494c98b722c378515cf30130e1a4baf72" exitCode=0 Jan 29 18:15:23 crc kubenswrapper[4886]: I0129 18:15:23.015473 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8x4lm" event={"ID":"c23f34ba-7d01-4793-85de-7d1cecfcfd89","Type":"ContainerDied","Data":"159880fb7797e3eb5c8a3430fee5383494c98b722c378515cf30130e1a4baf72"} Jan 29 18:15:24 crc kubenswrapper[4886]: I0129 18:15:24.034180 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8x4lm" event={"ID":"c23f34ba-7d01-4793-85de-7d1cecfcfd89","Type":"ContainerStarted","Data":"385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815"} Jan 29 18:15:29 crc kubenswrapper[4886]: I0129 18:15:29.093344 4886 generic.go:334] "Generic (PLEG): container finished" podID="c23f34ba-7d01-4793-85de-7d1cecfcfd89" containerID="385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815" exitCode=0 Jan 29 18:15:29 crc kubenswrapper[4886]: I0129 18:15:29.093391 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8x4lm" event={"ID":"c23f34ba-7d01-4793-85de-7d1cecfcfd89","Type":"ContainerDied","Data":"385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815"} Jan 29 18:15:30 crc kubenswrapper[4886]: I0129 18:15:30.112666 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8x4lm" event={"ID":"c23f34ba-7d01-4793-85de-7d1cecfcfd89","Type":"ContainerStarted","Data":"fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2"} Jan 29 18:15:30 crc kubenswrapper[4886]: I0129 18:15:30.150418 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8x4lm" podStartSLOduration=3.59326439 podStartE2EDuration="10.150391429s" podCreationTimestamp="2026-01-29 18:15:20 +0000 UTC" firstStartedPulling="2026-01-29 18:15:23.017621923 +0000 UTC m=+6805.926341205" lastFinishedPulling="2026-01-29 18:15:29.574748942 +0000 UTC m=+6812.483468244" observedRunningTime="2026-01-29 18:15:30.136039692 +0000 UTC m=+6813.044759004" watchObservedRunningTime="2026-01-29 18:15:30.150391429 +0000 UTC m=+6813.059110741" Jan 29 18:15:31 crc kubenswrapper[4886]: I0129 18:15:31.348110 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 
18:15:31 crc kubenswrapper[4886]: I0129 18:15:31.348560 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:31 crc kubenswrapper[4886]: I0129 18:15:31.615048 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:15:31 crc kubenswrapper[4886]: E0129 18:15:31.615386 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:15:32 crc kubenswrapper[4886]: I0129 18:15:32.495030 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8x4lm" podUID="c23f34ba-7d01-4793-85de-7d1cecfcfd89" containerName="registry-server" probeResult="failure" output=< Jan 29 18:15:32 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Jan 29 18:15:32 crc kubenswrapper[4886]: > Jan 29 18:15:41 crc kubenswrapper[4886]: I0129 18:15:41.407775 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:41 crc kubenswrapper[4886]: I0129 18:15:41.478124 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:41 crc kubenswrapper[4886]: I0129 18:15:41.654206 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8x4lm"] Jan 29 18:15:43 crc kubenswrapper[4886]: I0129 18:15:43.282068 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8x4lm" podUID="c23f34ba-7d01-4793-85de-7d1cecfcfd89" containerName="registry-server" containerID="cri-o://fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2" gracePeriod=2 Jan 29 18:15:43 crc kubenswrapper[4886]: I0129 18:15:43.851755 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.002126 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-catalog-content\") pod \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.002500 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n5zm\" (UniqueName: \"kubernetes.io/projected/c23f34ba-7d01-4793-85de-7d1cecfcfd89-kube-api-access-9n5zm\") pod \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.002617 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-utilities\") pod \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\" (UID: \"c23f34ba-7d01-4793-85de-7d1cecfcfd89\") " Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.004392 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-utilities" (OuterVolumeSpecName: "utilities") pod "c23f34ba-7d01-4793-85de-7d1cecfcfd89" (UID: "c23f34ba-7d01-4793-85de-7d1cecfcfd89"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.028674 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c23f34ba-7d01-4793-85de-7d1cecfcfd89-kube-api-access-9n5zm" (OuterVolumeSpecName: "kube-api-access-9n5zm") pod "c23f34ba-7d01-4793-85de-7d1cecfcfd89" (UID: "c23f34ba-7d01-4793-85de-7d1cecfcfd89"). InnerVolumeSpecName "kube-api-access-9n5zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.105689 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.105720 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n5zm\" (UniqueName: \"kubernetes.io/projected/c23f34ba-7d01-4793-85de-7d1cecfcfd89-kube-api-access-9n5zm\") on node \"crc\" DevicePath \"\"" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.146782 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c23f34ba-7d01-4793-85de-7d1cecfcfd89" (UID: "c23f34ba-7d01-4793-85de-7d1cecfcfd89"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.207884 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23f34ba-7d01-4793-85de-7d1cecfcfd89-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.296241 4886 generic.go:334] "Generic (PLEG): container finished" podID="c23f34ba-7d01-4793-85de-7d1cecfcfd89" containerID="fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2" exitCode=0 Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.296277 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8x4lm" event={"ID":"c23f34ba-7d01-4793-85de-7d1cecfcfd89","Type":"ContainerDied","Data":"fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2"} Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.296308 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8x4lm" event={"ID":"c23f34ba-7d01-4793-85de-7d1cecfcfd89","Type":"ContainerDied","Data":"3a7d98625554d3cac706804f00e5be38dcbb162664ca97b0c5839e0488d0d6e3"} Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.296343 4886 scope.go:117] "RemoveContainer" containerID="fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.296384 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8x4lm" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.351461 4886 scope.go:117] "RemoveContainer" containerID="385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.361861 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8x4lm"] Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.388525 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8x4lm"] Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.388851 4886 scope.go:117] "RemoveContainer" containerID="159880fb7797e3eb5c8a3430fee5383494c98b722c378515cf30130e1a4baf72" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.441356 4886 scope.go:117] "RemoveContainer" containerID="fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2" Jan 29 18:15:44 crc kubenswrapper[4886]: E0129 18:15:44.441755 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2\": container with ID starting with fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2 not found: ID does not exist" containerID="fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.441783 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2"} err="failed to get container status \"fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2\": rpc error: code = NotFound desc = could not find container \"fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2\": container with ID starting with fabd372238950cdaa18aa497284d0f0db5e1eadfa5c30b8c1ca62969f03c7ae2 not found: ID does not exist" Jan 29 18:15:44 crc 
kubenswrapper[4886]: I0129 18:15:44.441801 4886 scope.go:117] "RemoveContainer" containerID="385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815" Jan 29 18:15:44 crc kubenswrapper[4886]: E0129 18:15:44.442313 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815\": container with ID starting with 385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815 not found: ID does not exist" containerID="385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.442343 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815"} err="failed to get container status \"385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815\": rpc error: code = NotFound desc = could not find container \"385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815\": container with ID starting with 385b78cd5b959e5ed39ba705ed1dbdacd0d3fe904a297951d7779db2ed426815 not found: ID does not exist" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.442355 4886 scope.go:117] "RemoveContainer" containerID="159880fb7797e3eb5c8a3430fee5383494c98b722c378515cf30130e1a4baf72" Jan 29 18:15:44 crc kubenswrapper[4886]: E0129 18:15:44.442949 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"159880fb7797e3eb5c8a3430fee5383494c98b722c378515cf30130e1a4baf72\": container with ID starting with 159880fb7797e3eb5c8a3430fee5383494c98b722c378515cf30130e1a4baf72 not found: ID does not exist" containerID="159880fb7797e3eb5c8a3430fee5383494c98b722c378515cf30130e1a4baf72" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.443014 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"159880fb7797e3eb5c8a3430fee5383494c98b722c378515cf30130e1a4baf72"} err="failed to get container status \"159880fb7797e3eb5c8a3430fee5383494c98b722c378515cf30130e1a4baf72\": rpc error: code = NotFound desc = could not find container \"159880fb7797e3eb5c8a3430fee5383494c98b722c378515cf30130e1a4baf72\": container with ID starting with 159880fb7797e3eb5c8a3430fee5383494c98b722c378515cf30130e1a4baf72 not found: ID does not exist" Jan 29 18:15:44 crc kubenswrapper[4886]: I0129 18:15:44.627194 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c23f34ba-7d01-4793-85de-7d1cecfcfd89" path="/var/lib/kubelet/pods/c23f34ba-7d01-4793-85de-7d1cecfcfd89/volumes" Jan 29 18:15:46 crc kubenswrapper[4886]: I0129 18:15:46.619997 4886 scope.go:117] "RemoveContainer" containerID="b900b9c884451219b68e72739d460e4d06900b18f10f7003c7040961c812bb7b" Jan 29 18:15:46 crc kubenswrapper[4886]: E0129 18:15:46.621142 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gx4vp_openshift-machine-config-operator(5a5d8fc0-7aa5-431a-9add-9bdcc6d20091)\"" pod="openshift-machine-config-operator/machine-config-daemon-gx4vp" podUID="5a5d8fc0-7aa5-431a-9add-9bdcc6d20091" Jan 29 18:15:57 crc kubenswrapper[4886]: I0129 18:15:57.189912 4886 scope.go:117] "RemoveContainer" containerID="e970dea6a6e8251fa9ff24484a3f5ffaee4ce0d2fad251a5d786e848db7373be" 